[MUSIC PLAYING]

RAM RAMANATHAN: Welcome, everybody. My name is Ram Ramanathan, and with me is Rajiv Salimath from PricewaterhouseCoopers. What we want to spend the next hour or so doing is talking to you about how you can use machine learning to change common retail processes. Have you ever wondered how we can enable customers to find the most relevant and the right product at the right time, or how you can let them search more easily than with a plain textual query? People are more and more visual, so how do we enable them to search better? That's really what we want to spend the next hour on: how we can apply ML to common scenarios in retail.

Here is a lovely diagram showing a deep learning neural net. Plenty of sessions are going to talk about convolutional neural nets and spend a lot of time on deep learning, and the goal there, of course, is to detect whether an image is a cat or a dog. Which is great for a vision scenario, but what does that have to do with retail? The fact that you can find a cat or a dog is interesting (I love dogs), but does it really help you on a day-to-day basis? So that's really what we want to do: spend some time thinking about common ML scenarios among your customers and your day-to-day use, and how you can apply them.

We think of three key scenarios where machine learning comes into play. One is enabling your customers or your users to find the right product at the right time. How do we move people away from plain textual searches toward finding products more visually? The second is how we help you recommend the right product at the right time. If you really think about it, one of the things we all deal with more and more is choice, and one way we can help customers is by giving them the right product at the right time based on their prior behavior.
And the third piece is, once you have released a product out into the wild: how is this product doing? What are people saying about it? What else do they refer to when they use this product? So you can get a better feeling for sentiment, understand what they do with the product, and get answers that can drive your life cycle, from product development to marketing to multiple downstream activities.

So I'm going to talk about these three scenarios. Let's start with why finding your favorite product matters. How do we do that today? For most retail stores, we normally think about their online web presence. You have a textual box, and people do a product search query. People normally gravitate to search first, partly because of the behavior they practice every day when they open a Chrome browser and search in the tab; they've been indoctrinated to search first. So in those scenarios, how good is your search experience? So far, searches have been mostly textual. But one of the things we've seen more and more is people moving to images and video to communicate, whether it's messaging applications, whether it's pinning their favorite images on Pinterest or their favorite social app, or even scanning through blogs to get inspiration for what they want to purchase. Once they do get that inspiration, what do they do with it? They can't really find the product at their favorite retailer, because most probably it's just an image. So how do we enable those scenarios? How do you enable your consumers to discover products more easily? That's one aspect. The second piece is how we think about recommending the right product to the right customer.
I think one of the use cases, when we think about Google and recommendation, is that we have over a billion hours of video content watched every day on YouTube. You may say, OK, what's that got to do with recommendation? The reason people watch so much video on YouTube is that they see the Related Videos list on the lower right. Based on your user behavior and the video metadata, we serve up videos that are relevant for your context. Not just clickbait, meaning if you watch a cat video, show more cat videos, but a real focus on what's relevant for that end user, thinking about watch time and asking: is this a video you really want to look at? That's the content perspective, but you can translate it to products on a day-to-day basis. Just because somebody bought something last year does not mean they're going to buy it this year, because of externalities like seasonality. There may be social events; because one of their favorite teams won a sports event, that may drive behavior. There may be more internal factors that drive recommendations, [INAUDIBLE] out of stock, et cetera. How do you take all of that into account to drive the right product recommendation for the end customer?

And the third piece, as I said, is sentiment. Now that you've released a product out into the wild, and people are doing all kinds of craziness with it, what's going on with the product? Do people actually like it? It's funny enough: I am a frequent customer of both Google Shopping and Amazon, and I spend a lot of time reading the reviews. Those reviews clearly influence what I purchase. So for you as a manufacturer or retailer, you really need to understand what people are saying about your product out in the wild.

And you really want to understand: what else do they talk about when they talk about this product? What are the other entities? What is the most perceived substitute in the customer's mind? What else do they normally mention from a comparative perspective when they talk about your product? How can we help you understand that? All that data is not necessarily inside your organization; a lot of it is outside, in social media. How do you harness all that data to get insights into your recommendations?

So that's really what we're going to spend the rest of the time on: these three scenarios, finding products, recommendations, and sentiment, and how we start driving them. Here is a quick view of all the different services we provide on Google Cloud Platform, and what I want to spend some time on today is how we bring these services together for one or two scenarios.

From a machine learning perspective, in general, there are two key ways that we provide access to machine learning. One is as a platform play. We're going to be talking more about TensorFlow tomorrow, where we think of TensorFlow as the number one open source machine learning platform. Any of you can build your own models on your data and run them anywhere, and then, when you're ready to run at scale in a managed environment, you can bring them to our cloud. Cloud Machine Learning Engine helps you run your machine learning models, both from a training perspective and a prediction perspective. That's really targeted at data scientists and folks who want to manage the machine learning life cycle from an end-to-end perspective.

At the other end are what we call the machine learning APIs. These are APIs focused on using Google models, on Google infrastructure, providing you targeted solutions for specific scenarios. For example, our Natural Language API helps you do things like finding named entities, meaning entities in a block of text, or finding things like sentiment. For images, we released the Vision API last year: how do you find common entities in an image? And for video, we just announced the Video Intelligence API yesterday. So that's where we use Google models to get insight into unstructured data. We also have other APIs, like the Jobs API, where we focus on very targeted vertical use cases, like job placement.

That's the portfolio of machine learning products we have today on Google Cloud Platform: on one hand, a platform based on TensorFlow with Cloud ML Engine, and on the other, a set of ready-to-use ML APIs. And the goal for this talk is how we can use some of this technology to drive those three scenarios we talked about. How do you find your products more easily? How do you drive recommendations? And how do you help customers understand sentiment?

So let's focus on the first one: how do you help find the right product? The scenario I'm going to focus on is search by image. As you can imagine, there are lots of other pieces to finding the right product. There are things around search engine optimization; there are things about tuning your existing search index using [INAUDIBLE] search. But in my mind, those are technologies that have been in the market for a few years. It's a well-understood process, how you do search engine optimization and how you drive searches across your metadata today; people have solutions for that. So what I want to spend some time on is how we think about finding the right product by image.
There are a couple of things you can do here. One is, when a user of your website submits an image in the search box, how can I get metadata on that image? What is that image about, so I can drive a search process? The other aspect is logos: is there product branding information I can use? And then, how do I think about fraud? If I'm a marketplace where different sellers are posting products they want to sell, how do you know, as the marketplace owner, whether a listing is legitimate? Is somebody just doing a phishing exercise, taking some random stock image and posting it on the marketplace? Is this a valid image? So we're going to talk through a couple of ML techniques that enable those scenarios.

The first piece is image metadata. This is something we released last year with the Vision API. It works in more generalized scenarios, so the information you get back is more generic, a higher level of categorization. It's not very targeted, specific information, but it can tell you things like "this is a leather handbag" and give you the color of the handbag, and that gives you metadata to drive a subsequent search. For example, one of our customers may be enabling a search-by-image feature in their app, and they want to drive searches like "brown canvas bag." We'll give you the color, we'll give you the term "bag," and potentially that it's a canvas bag, and you can then drive a search index against the metadata they've already collected.
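As a rough sketch of that flow, here is what a label-detection request and response might look like against the Vision API's v1 REST interface. The request/response shapes follow the public API, but the sample response below is mocked for illustration, and no API call is actually made:

```python
def build_label_request(image_b64, max_results=10):
    """Build a Vision API v1 annotate request body for label detection."""
    return {
        "requests": [{
            "image": {"content": image_b64},
            "features": [{"type": "LABEL_DETECTION",
                          "maxResults": max_results}]
        }]
    }

def extract_labels(response, min_score=0.7):
    """Pull label descriptions above a confidence threshold,
    suitable for feeding into an existing product search index."""
    annotations = response["responses"][0].get("labelAnnotations", [])
    return [a["description"] for a in annotations
            if a.get("score", 0) >= min_score]

# Mocked response, as if we had posted a photo of a handbag:
sample = {"responses": [{"labelAnnotations": [
    {"description": "bag", "score": 0.96},
    {"description": "leather", "score": 0.91},
    {"description": "brown", "score": 0.85},
    {"description": "fashion accessory", "score": 0.62},
]}]}
print(extract_labels(sample))  # ['bag', 'leather', 'brown']
```

The extracted terms ("bag", "leather", "brown") are exactly the kind of generic metadata the talk describes feeding into a retailer's own search index.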

Another piece we talked about is using the web for fraud detection and topical entities. We just released a new feature called Web Detection. That's really the functionality underneath Google Image Search. What it gives you is this: if an image exists on the web (and that's the important point, if it exists on the web), then our web crawler, which as you can imagine crawls billions of pages on a daily basis, has indexed a lot of these images, and we can find metadata about that image and feed it back to you. So when you post an image to us, we can tell you the metadata for that image gathered from lots of different pages: this image means A, B, C. And we can also tell you whether a similar image exists somewhere else on the web.

The scenario we think about for topical entities is, for example, you want to find a product with a very famous logo or character, like a Marvel character or a character from a BBC show that's very popular on the web. We would most probably pick up that character from the image. The other aspect is fraud scenarios: if you're a marketplace, you want to see whether people are posting fraudulent images or stock images on the marketplace. It all works out of the box, and I'll show you some examples.

But be careful with this. Because we use web metadata, you can get a lot of false positives too. As an example, this is clearly a Stormtrooper image, but we also return a metadata label called "pregnancy." And you're like, why did we get that? If you click on this link, it actually goes to a Pinterest page, where somebody has used this image for a baby shower.
And so that’s [INAUDIBLE] So very clearly, it’s a very popular pinned page And so a lot of pages are linked to it And that drives our ranking And it clearly was on this image of a Stormtrooper helmet It’s now linked to pregnancy So you just need to think through I just want to caution you, this is not a antidote to all problems It works well for many scenarios, but also doesn’t work well for some scenarios So that’s a piece that I wanted to leave you guys with So let’s kind of walk through how you want to do that Can you switch to a demo? Perfect So let’s start off with this I’m a big Star Wars junkie So in case you guys see a lot of things [INAUDIBLE] So it’s not that Disney is endorsing me or anything like this It’s just that I like “Star Wars” stuff So here’s an example where, for example, we get all this information based on this image It’s LEGO It’s Star Wars It’s Lego Star Wars So it’s information about the product And if you go back to the prior image where we talked about the [INAUDIBLE] bag So let me go back to that scenario So here’s another example where, in this case, it’s a mug Again, I’m a huge Star Wars junkie So in case you’re wondering why all my products are all at some relation to Star Wars, it’s just because “Rogue One” is an awesome movie I think I’m not wired on the internet, because I think you guys– [INAUDIBLE] So let’s go back and start this again So for example, in the image that I just showed you earlier in the thing, so in this case, it brings up a leather bag And here’s the metadata that we give you And this is the other additional metadata that you get from the web And so you can imagine how you can start using this metadata to drive the scenarios Again, I just wanted to reiterate, for the web metadata, it really depends if a similar image exists on the web So if it’s an object that’s not present on the external web, you most probably will not get any metadata on it I talked a lot about products But there are also other cases where 
people are thinking about furniture shopping People are thinking about other scenarios And here’s a couple of examples of that So in this case, I basically pinned up, grabbed a bunch of images from– for example, you have a for, potentially, a furniture store,

people are coming and loading images They want to find similar stuff And you can imagine where you get metadata on those pieces or the box We give you additional metadata that you can now use to drive potential additional searches downstream I’ll just do one more image And then I’ll go back to the “Rogue One.” And the key aspect is here, for example, we recognize it’s “Rogue One,” Star Wars, Jyn Erso, et cetera, based on this cup And then the other piece that we also give you is information– where else does this image exist? So you can imagine, we give you this additional information of all the stores where this image shows up at So that’s another essential piece We think about a marketplace People have uploaded a bunch of images You guys want to do a quick fraud check to see what if these images are stock images or they come from a valid place, you can very easily run those images through the API and see if these are valid images or not and then use the subsequent metadata to drive searches Back to slide So let’s move on to the next use case We’re talking about recommendation So in the case of recommendation, there’s a lot of different scenarios that we want to talk about Really, you can think about it drives a lot of different scenarios all the way from thinking about how to view the right content in YouTube to thinking about how you can write appropriate cross-sell or upsell for the customer But as I mentioned, there’s a lot of different factors that drive into the recommendation aspect So you really want to think about, what’s the history of the user behavior? What has a user been doing on your site? Or what purchases have they done with your store prior to that? What’s the current product metadata? 
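A marketplace fraud check along these lines could be sketched as follows. The field names follow the Vision API v1 `webDetection` annotation shape, but the thresholds and the sample response are our own illustrative choices, not anything prescribed by the API:

```python
def looks_like_stock_image(web_detection, match_threshold=5):
    """Flag a seller-uploaded image as a possible stock/reused image
    when exact copies of it already appear on many web pages."""
    full_matches = web_detection.get("fullMatchingImages", [])
    pages = web_detection.get("pagesWithMatchingImages", [])
    return len(full_matches) >= match_threshold or len(pages) >= match_threshold

def topical_entities(web_detection, min_score=0.5):
    """Return web entities (e.g. 'Star Wars') usable to drive search."""
    return [e["description"] for e in web_detection.get("webEntities", [])
            if e.get("score", 0) >= min_score and "description" in e]

# Mocked webDetection result for the Stormtrooper example:
sample = {
    "webEntities": [
        {"description": "Stormtrooper", "score": 1.2},
        {"description": "Star Wars", "score": 0.9},
        {"description": "Pregnancy", "score": 0.3},   # the false positive
    ],
    "fullMatchingImages": [{"url": "http://example.com/%d" % i}
                           for i in range(8)],
}
print(topical_entities(sample))        # ['Stormtrooper', 'Star Wars']
print(looks_like_stock_image(sample))  # True
```

Note how the score threshold also happens to filter out the low-confidence "pregnancy" false positive from the Pinterest baby-shower page; in practice you would tune that cutoff against your own data.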
That’s really where, if you can imagine, if your product is more of a physical device, your product is more of a soft device, whether it’s a video or it’s a piece of content, you really want to get that metadata, so you can drive things downstream around recommendation models And Rajiv’s going to talk a little bit about some of the models that enable that We provide you a rich platform to enable that And so right now, what we use is TensorFlow So [INAUDIBLE] really when you think about machine learning, there is a pipeline for that There’s a pipeline for really thinking about training, how you think about training this model, where you take your data, the data gets the data in GCS, in Google Cloud Storage You feed it through a data flow pipeline to process the data to get the right features that you want Then you get a TensorFlow model that you have gone through training, that you look at the model parameters, tweak it appropriately, and then the model goes into production, which then drives downstream processes So this is really where TensorFlow and things like cloud ML Engine comes into play to really drive this into a model scenario But then you can imagine, the pipeline doesn’t just sit with machine learning The pipeline is end to end The pipeline goes all the way from how you think about your logs data, your clickstream behavior coming in from– whether it’s cloud Pub/Sub, whether it’s metadata flowing through things like Cloud Storage And the end result is really thinking about all your analysts who have to do things around hey, help me understand my prior history, my sales behavior, my purchase history And thinking about how to then figure out all the BI and data warehousing stuff that’s been going on for 10, 15 years All those things does not go away with machine learning You can imagine, you need that to drive machine learning If you don’t have that, go back and fix your BI and data warehouses Because without data, there is no ML, right? 
So the dirty secret of machine learning is that 90% of the work is getting your data into your data warehouse, cleansed and processed, so you can drive the downstream machine learning processes. We shouldn't forget that piece. Think of the model as the tip of the spear: it's the shiny part, but there's a lot of work that happens to get the data to the model for training.

So this scenario describes the end-to-end pipeline. Clickstream behavior, lots of events from all your servers, feeds through a Cloud Pub/Sub pipeline that's integrated with Dataflow, which does your processing and then dumps the data into BigQuery, where your analysts will most probably run a bunch of queries and analysis with their favorite BI tool, whether that's Tableau or something else. Once you've done your data processing, what they call the data janitorial work, where you've figured out the right data and the right features, you do subsequent processing, feed that into a format compatible with TensorFlow, train your model on Cloud Machine Learning Engine, do an appropriate level of evaluation, and then put it in production. Most probably not directly into production: you'll do something like an A/B test first, to figure out whether this is the right model and whether you're getting the results you want, before you roll it out. There's a lot of process between user behavior data and an actual ML model.

What I want to hand off to Rajiv is how someone actually did this in production. Rajiv is going to walk through a use case where he helped a customer through this scenario end to end.

RAJIV SALIMATH: Thanks, Ram.
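To make the "data janitorial" step concrete, here is a minimal sketch of the kind of transform a Dataflow job might apply: turning one raw clickstream event, as it could arrive from Pub/Sub, into a flat row suitable for BigQuery and later feature extraction. All field names here are hypothetical, not a real schema from the talk:

```python
import json
from datetime import datetime, timezone

def event_to_row(raw_message: bytes):
    """Parse one Pub/Sub message payload into a flat dict (one BQ row)."""
    event = json.loads(raw_message)
    return {
        "user_id": event["user"]["id"],
        "product_id": event.get("product", {}).get("sku"),
        "action": event["action"],           # e.g. view / add_to_cart / buy
        "event_ts": datetime.fromtimestamp(
            event["ts"], tz=timezone.utc).isoformat(),
    }

# One synthetic event, as it might arrive off the wire:
msg = json.dumps({"user": {"id": "u42"}, "product": {"sku": "SKU-9"},
                  "action": "add_to_cart", "ts": 1490000000}).encode()
row = event_to_row(msg)
print(row["user_id"], row["action"])  # u42 add_to_cart
```

In a real pipeline this function would be the body of a Dataflow/Beam `DoFn`, with error handling for malformed events; the point is simply that most of the work is this kind of cleansing and flattening, not the model itself.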

I’m juggling lots of devices And I hope all of them work OK with Wi-Fi and the screen resolution Great So I do machine learning stuff at PwC And we do a lot of the dirty work that Ram was talking about, taking data, taking historic data And across our clients, we see some clients have a lot of data, but they have security issues about what can be used, what cannot be used Some clients don’t have data And some clients just don’t know what type of data they have And we take that data and we figure out how to put it in the right context And we’ve built several training models on TensorFlow And I’ll talk about one of them, which is for a company called Sky, which is a large television package provider in Mexico It’s kind of like the Verizon or AT&T of Mexico So the challenge here is that they literally have door-to-door salesmen who go and try to sell you different television packages And over the years, they’ve got lots of sales data in terms of what television package was sold to which area, but they don’t have any data in knowing which type of people are interested in sports packages, which type of people are interested in HBO, who wants to watch reality TV, or anything like that So what we created for them is we built simple tablet applications, retail kiosks for them So users can interact with these kiosks in a fun way and they can just pick a few movies or TV shows that they like And based on that, the machine learning code figures out what type of television package and what kind of hardware, along with that, that person would be best suited for This is a problem where you have to try and figure out what a user wants with very little data In other sessions, we’ve spoken about what to do with a lot of data And we built this on a cloud platform Essentially, what we have is the kiosk screen And all the app code is driven on App Engine And we basically have the input data coming when a user selects different TV shows that he or she likes, and that goes into a 
single-layered neural network that we’ve built and deployed on TensorFlow And I’ll show you some screens and graphs from that as well And based on that, the TensorFlow figures out what products should be recommended for that person Essentially, the math challenge here is, how do you do good filtering for a user without having a lot of data about that user? There’s a lot of different types of filtering you can do Essentially, they all have fancy names But basically, it comes down to, do you want to filter stuff based on what this user has done in the past or based on if I like to watch HBO, somebody else likes to watch HBO, and that somebody else also likes to watch Comedy Central, then what’s the likelihood I would like to watch Comedy Central as well? And there’s different types of filtering techniques like that We’ve picked a collaborative filtering technique here What it basically means is if you have all your users on one axis and you’ve all your shows or TV packages on the other axis, you have some data for users for some of the packages and you have holes in places And so the whole mathematical challenge is, how do you fill the holes? How do you know that if I’ve picked A, B, and C, then what’s the likelihood I’d be interested in D as well? And the reason we use TensorFlow for this is our initial training data had something like 20,000 users And on the other axis, we had some data for these users on about 300 different types of TV channels So essentially, it was like a 20,000 by 300 matrix that we were executing But that pretty soon exploded into having over two million users and something like 20,000 data points And so that becomes a very hard computational problem And that’s where you can easily tweak your models in TensorFlow to scale with the amount of data that you get
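The "fill the holes" idea can be sketched with a toy matrix factorization: factor a user-by-channel matrix with missing entries into low-rank user and channel embeddings, then use their product to predict the holes. This is a plain NumPy illustration of the collaborative filtering concept, not PwC's actual TensorFlow model, and the data is synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# 6 users x 5 channels; 0 marks a hole (no data for that pair).
R = np.array([[5, 3, 0, 1, 0],
              [4, 0, 0, 1, 1],
              [1, 1, 0, 5, 4],
              [0, 1, 5, 4, 0],
              [1, 0, 4, 0, 4],
              [5, 4, 0, 1, 0]], dtype=float)
mask = R > 0
k = 2                                    # embedding size
U = rng.normal(scale=0.1, size=(6, k))   # user factors
V = rng.normal(scale=0.1, size=(5, k))   # channel factors

def loss():
    err = (R - U @ V.T) * mask           # error only on observed entries
    return (err ** 2).sum()

lr = 0.01
first = loss()
for _ in range(2000):                    # simple alternating gradient steps
    err = (R - U @ V.T) * mask
    U += lr * err @ V
    V += lr * err.T @ U

print(loss() < first)                    # True: the fit improved
prediction = U[0] @ V[2]                 # user 0's predicted score for a hole
```

At Sky's scale (millions of users), the same factorization would be expressed as a TensorFlow graph so training can be distributed, but the objective, minimizing reconstruction error on the observed entries only, is the same.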

So I’ll try to show you some of these things Do you which one to plug into the tablet? Can I have the tablet screen, please? There you go It’s going to be a little psychedelic, because for some reason, I can’t get it to work in the other resolution But think of this as your afternoon neck exercise But you could also drop by at the PwC booth And it works in a less psychedelic way there So let’s tap on this We call this the TV Show Gobbler So it first has a little image recognition thing It recognizes me But I always make this joke that all machine learning, the goal is to detect Indian names properly And it still detects Rajiv as Ratchet But we’ll go with that for now Let’s play So imagine this as a kiosk And it shows you a bunch of TV shows And you select a bunch of things that you like And it’s going to try to figure out what your interests are based on this Definitely, “Halt and Catch Fire,” “Westworld.” And if you see the bottom tab, it already detected– from a very little amount of data, it detected what I like Obviously, the more number of things that I pick– got to pick “The Wire”– the more number of things that you pick, the more it learns about you And so let’s see what it kind of figured out about me So it figures out a bunch of interest tags, a bunch of things based on facial recognition It tries to estimate my age And then it looks at, you know, I like the show “Halt and Catch Fire.” How many other people have liked it? In that age demographic, how many people like it? 
And it executes that tensor computation, which I'll show you. Based on that, it selects these different items: the ones in white are not for me, and the ones in color are what the algorithm thinks I'd be interested in. And I'm so happy it didn't pick reality TV in front of all of you.

So let's see what it recommends based on that. From this, our customer Sky can figure out what bundle of TV packages they can sell to me. These bundles have different channels, they're at a certain price, and they come with a certain amount of hardware. If I select one of these bundles, you can then see other related products for it. This also has image recognition: if I place a product in front of it, it'll detect the product. It's trying to do it right now, but I don't have a product to place in front of it. Any product, whether it has a bar code or not, if you put a Google Home or a Daydream or something in front of it, it'll pick it up, find the right price, and let you add it to your queue, contact a sales rep, all kinds of things like that. So if you come over to the PwC kiosk at some point, we'll show you all of this at a better screen resolution.

If I can have the laptop again, please. Can I have that link for the TensorBoard? Thanks a lot.

RAM RAMANATHAN: Could we move to the demo box?

RAJIV SALIMATH: Great. I'm not sure how many of you know a lot about this and are going to grill me about it, and how many of you don't know much at all about TensorFlow, so I'll briefly go over some things. Essentially, TensorFlow is a computation engine.
You give it a model and it's going to execute it for you. You design that model, which in this case happens to be a single-layer neural network, and when you feed it, you see this function called softmax here. That's the function we use, and it decides the bias and the weights.

So if you think about all the users on one axis and all the interests on the other axis, let's say, hypothetically, it's a 20,000-by-200 matrix. That's my input tensor, and it has to give me an output that tells me, for each of these 20,000 users, which one of the 10 television packages to sell them. So the 20,000-by-200 input training tensor should give me an output of 20,000 by 10, essentially. To do that, if you were doing it without something like TensorFlow, you'd sit and compute different permutations and combinations of the weights and the bias you have to put in. The beauty of TensorFlow is that it lets you define a function and play with the bias and the weights rapidly, until you get your model to a level where it looks like you're getting the right outputs on the training set.

When you have that model, you look at the distributions: over time they have to converge, so you know that the bias you've given, or the weights in your weighted algorithm, are coming to a sort of convergence, and you know how good or bad your training model is. And I'll show you one last thing, maybe. If you're doing anything with TensorFlow, you've always got somebody yelling at you: how do I know how good or bad this is? How do I know how accurate it is?
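The shapes described above can be sketched in a few lines. This is a toy NumPy version of the single-layer softmax network, scaled down from the talk's 20,000-by-200 example, with random (untrained) weights, so it only illustrates the shape of the computation, not a real recommendation model:

```python
import numpy as np

rng = np.random.default_rng(1)
n_users, n_interests, n_packages = 4, 6, 3

# Binary "likes" matrix: which interests each user selected at the kiosk.
X = rng.integers(0, 2, size=(n_users, n_interests)).astype(float)
W = rng.normal(size=(n_interests, n_packages)) * 0.1   # weights
b = np.zeros(n_packages)                               # bias

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)    # subtract max for stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

probs = softmax(X @ W + b)                  # shape: (n_users, n_packages)
print(probs.shape)                          # (4, 3)
print(np.allclose(probs.sum(axis=1), 1.0))  # True: each row is a distribution
best_package = probs.argmax(axis=1)         # recommended package per user
```

Training, in TensorFlow or anywhere else, then amounts to adjusting `W` and `b` so that the softmax output matches the packages users actually bought, which is exactly the "play with the bias and the weight" loop Rajiv describes.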
You kind of have to look at the loss curve over time: it has to go down, even if it doesn't go down by much. If it's not going down, if that loss curve isn't decreasing, then the variance is too high and you have to go back and tweak your model some more. I just wanted to show you a couple of those screens. TensorFlow makes it really easy to do these things, visualize them in TensorBoard and execute the functions in TensorFlow, instead of having to go through lots of permutations and iterations of those input functions by hand. So that's what we did with one of our clients, Sky, and I'll give it back to Ram.

RAM RAMANATHAN: So that was a quick look at [INAUDIBLE]. Going back to the story we've been telling: first, help your customers find the right product. The second aspect is recommending the right, relevant product at the right time. And the third piece is understanding product reception. What's going on in the field? How is the product being received? What are customers saying about it? How do we understand product reviews, et cetera?

That's where I want to spend some time, addressing that third scenario, focusing on natural language [? understanding. ?] There are a lot of other pieces that go into gauging product reception. You can imagine aspects like references for your product, or, if you think about social media and the images that flow through it: how is my product logo perceived on social media? What do people normally do with my product?
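Rajiv's rule of thumb, the loss curve has to trend downward, can be expressed as a simple heuristic check on logged losses. This is our own illustrative sketch, not a TensorBoard feature:

```python
def is_converging(losses, tolerance=0.95):
    """True if the average loss over the last half of training is
    meaningfully below the average over the first half."""
    half = len(losses) // 2
    early = sum(losses[:half]) / half
    late = sum(losses[half:]) / (len(losses) - half)
    return late < early * tolerance

print(is_converging([2.3, 1.8, 1.2, 0.9, 0.7, 0.65]))  # True: trending down
print(is_converging([2.3, 2.4, 2.2, 2.5, 2.3, 2.4]))   # False: flat curve
```

In practice you would eyeball this in TensorBoard's scalar plots rather than compute it by hand, but the flat-curve case is the signal to go back and tweak the model.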
That’s another aspect that we can think about, logo detection and image processing to enable those scenarios But the other piece, which is, I think, a lot more, I would say, fundamental is product reviews, right? You get this product reviews galore There are things that people are talking about on social media What are people saying about my product? There are product reviews coming from magazines, from publishers Find my product Tell me what’s going on Tell me what’s relevant for my product What else are they talking about? And the other aspects is when you think about, when you look at a review, things that you want to understand in the review is, what are the entities they’re covering in my review, right? For example, when they talk about my product, what other product are they talking about? What other competitor are they talking about? What else are they associating my product with? That’s really the thing I just talked about For example, suggest phones For example, if I’m selling a new model phone, what else do people normally think of when I talk about my phone? And the third part is, what’s the sentiment? Was it positive or negative? So you can then drive those pieces Further downstream pieces, you can imagine, are things like, hey, what are the most positive [INAUDIBLE] features, what are the most negative features, et cetera,

to drive product reception. There are lots of other pieces that come into this, you can imagine, like product telemetry, a key aspect for driving future product planning if you make electronic products. But this piece is really about, how can we use language understanding to see what the external world is saying about you? So if we switch back to the demo. The data I'm going to focus on is the Nexus 6P. Basically, I'm going to start off with the Nexus 6P and say, here is a product review. I picked a site, Trusted Reviews, and they cover a few aspects of how this phone was received. And what I'm going to do is use that product review with our Natural Language API. The Natural Language API offers things like named entity recognition: it recognizes the most relevant entities in a block of text. It finds things like organizations, people, and consumer goods out of the box, along with the most common [? nouns, ?] and these entities are normally linked to a URL on Wikipedia. So if it's an entity that's on Wikipedia, the Natural Language API will most probably pick it up. So let me go ahead and grab this piece of text. I hate doing a live demo sometimes, because I had actually copied it, but then, mechanical problems. So let's analyze it. One of the things I wanted to show is this concept called salience. The first thing is, it scanned this article and picked up that this article is really about the Nexus 6P, right? That's the first thing you really want to understand. Hey, you're going to get a lot of text coming into your organization; what is this article about? So that's the first thing it finds out. The other thing you want to think about is what the article really focuses on. One of the key value points Google really wanted to land was premium, right?
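The entity-and-salience step in this demo can be illustrated by parsing a response shaped like the Natural Language API's entity-analysis output. The entity names, types, salience values, and Wikipedia links below are made-up stand-ins for illustration, not the actual demo output:

```python
# A mock response roughly in the shape of a Natural Language API
# entity-analysis result (all values here are illustrative).
mock_response = {
    "entities": [
        {"name": "Nexus 6P", "type": "CONSUMER_GOOD", "salience": 0.62,
         "metadata": {"wikipedia_url": "https://en.wikipedia.org/wiki/Nexus_6P"}},
        {"name": "Samsung Galaxy S7", "type": "CONSUMER_GOOD", "salience": 0.18,
         "metadata": {"wikipedia_url": "https://en.wikipedia.org/wiki/Samsung_Galaxy_S7"}},
        {"name": "Google", "type": "ORGANIZATION", "salience": 0.09,
         "metadata": {"wikipedia_url": "https://en.wikipedia.org/wiki/Google"}},
    ]
}

def main_subject(response):
    """The highest-salience entity is what the article is 'about'."""
    return max(response["entities"], key=lambda e: e["salience"])["name"]

def co_mentioned_goods(response, subject):
    """Other consumer goods mentioned alongside the main subject,
    e.g. competitor or substitute products."""
    return [e["name"] for e in response["entities"]
            if e["type"] == "CONSUMER_GOOD" and e["name"] != subject]

subject = main_subject(mock_response)
print(subject)                                     # Nexus 6P
print(co_mentioned_goods(mock_response, subject))  # ['Samsung Galaxy S7']
```

This is exactly the reading of the demo output the speaker walks through: salience tells you the article is about the Nexus 6P, and the other consumer-good entities surface likely substitutes like the Galaxy S7.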
That's really where we focused: the Nexus 6P as a premium consumer mobile device. And again, the good thing is this article actually brought that up in [INAUDIBLE] flow. The other aspect we wanted to look at is, are there pieces around design salience? Are there other consumer goods that are normally referred to alongside the Nexus 6P? In this case, for example, we pick up the fact that people also talk about the Samsung Galaxy S7 in the same article. You can imagine those are substitutes from a customer perspective, and we can think about how to bring those two products together. Let me do another example. In this case, I brought up the Pixel C, so I'm going to go ahead and copy that piece of text here. Again, the key questions are the same. What is this article about? Second, what other organizations are normally referred to when we talk about this product? We bring up aspects like Samsung, and we think about other products that go with that. And then the other thing I want to start thinking about is, is this about a tablet? We figure out it's a consumer good out of the box, and you can imagine we start picking up the salient entities out of this. The next piece I want to talk about is, how do we think about the sentiment of this article? So I'm going to bring up an article about Google Home. Sorry, I know watching me copy text is not the most fun thing to do. So, a couple of things. In this case, what we've done is look at the sentiment analysis using our Natural Language API. Basically, when you take a block of text and run it through the Natural Language API, you get two kinds of scores. You get a document sentiment, and the two pieces you get are a score from minus 1 to plus 1,

minus 1 being negative and plus 1 being positive, and then a magnitude of how strong that sentiment was. So in this case, you would say this article is actually moderately positive. And then it gives you the document sentiment and kind of walks you through how the article itself flows. So as you start taking in these blocks of text, if you think about product reviews coming in, you want to scan all these articles and figure out which articles refer to your product. Then the second thing you want to figure out is, are these articles positive or negative on my product? And the third part, which I'm not covering today, is that you can also look at which features are positive and which features are negative. That's a part we're not going to cover. So in the end, what can you do with that? You can imagine things like a relationship graph, right? What we've done is run every Wikipedia article through our named entity recognition, and we started mapping a relationship graph based on which entities commonly appear together in an article. So this line basically shows there are a lot of articles where, for example, Nexus 6P and Motorola show up in the same article: it's a thicker line and a shorter line. And in the case, for example, of Nexus 6P and Sony, [INAUDIBLE] a few articles, so it's a much longer line and a thinner line. It's a very crude form of a relationship graph. You can now think about, how do all these products fit together? What's the relationship between the products? You can even imagine adding sentiment to that line, saying that, normally, you see strong, positive sentiment based on reviews. So you can get a better feel, as you look at it from a product perspective or a sales perspective, for how the relationships drive certain behavior. Does it make sense?
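The relationship-graph idea described above, edges weighted by how often two entities co-occur in an article, optionally annotated with the document sentiment, can be sketched like this. The articles, entity sets, and sentiment scores are invented for illustration:

```python
from collections import defaultdict
from itertools import combinations

# Each "article" is the set of entities found in it (e.g. via
# named entity recognition) plus a document sentiment score in
# [-1, 1]. All values here are made up for the sketch.
articles = [
    {"entities": {"Nexus 6P", "Motorola"}, "sentiment": 0.6},
    {"entities": {"Nexus 6P", "Motorola", "Google"}, "sentiment": 0.4},
    {"entities": {"Nexus 6P", "Sony"}, "sentiment": -0.2},
]

# Edge weight = number of articles where both entities appear
# (a thicker, shorter line in the visualization); we also track
# the average sentiment of those articles per edge.
counts = defaultdict(int)
sentiments = defaultdict(list)
for article in articles:
    for a, b in combinations(sorted(article["entities"]), 2):
        counts[(a, b)] += 1
        sentiments[(a, b)].append(article["sentiment"])

graph = {pair: {"weight": counts[pair],
                "avg_sentiment": sum(scores) / len(scores)}
         for pair, scores in sentiments.items()}

print(graph[("Motorola", "Nexus 6P")])  # weight 2, avg_sentiment 0.5
print(graph[("Nexus 6P", "Sony")])      # weight 1, avg_sentiment -0.2
```

The Nexus 6P/Motorola edge ends up heavier (more co-occurrences) than the Nexus 6P/Sony edge, mirroring the thicker-versus-thinner lines the speaker describes, with the sentiment annotation layered on top.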
And of course, that's great from a product perspective. But I want to end with something more important: who is Tywin Lannister related to in “Game of Thrones”? Because I'm a big “Game of Thrones” fan. You can imagine doing something similar for “Game of Thrones,” and you can see right now that Tywin Lannister is connected to Tyrion Lannister, even though it should be a much stronger relationship. How many of you are “Game of Thrones” fans? Not that many, so this joke just went right over your heads. So, coming back. Slides, please. What we went through right now, circling back, is the three key scenarios that machine learning can enable. One is, how do we find the right product, right? How do we provide a more intelligent search process within retail sites, thinking about image metadata, primarily, in this case, search by image? The second piece we talked about is recommendation, and Rajiv walked through a scenario about how to enable sales folks to recommend the right product to customers. And the third part, which we just covered, is understanding feedback on your products: how are people talking about your product, and what else are they talking about, so you can start building relationship graphs with the appropriate level of sentiment. So that's really what I wanted to cover in this session. I wanted to leave time for questions for the next 10 minutes or so. There are lots of other [? share ?] sessions that go deeper into machine learning. Some of them may already be done; when we did the slides, you can imagine, a lot of the sessions had already happened. I think there's one tomorrow on the data life cycle, which I highly recommend. And there are sessions on how to use TensorFlow with a lot of what Rajiv talked about. There's a good session on how to do TensorFlow on Cloud ML without a PhD.
I think those videos are up on YouTube, so if you're really more interested in TensorFlow, I would highly recommend them. For some of the aspects around the ML APIs, we have a session that basically covers all the different ML APIs, kind of a walk-through that Sara Robinson did earlier today. Those videos are also up on YouTube. So with that, I'd like to invite Rajiv up and open the floor to questions.