[MUSIC PLAYING] ADAM MICHELSON: Thanks for joining our session I’m Adam Michelson I am a product manager here at Google on the IoT team Joining me shortly will be a couple of folks from our partner, Cognite, Geir Engdahl and Sindre Hammerlund So they’ll be coming up in a minute and describing, in detail, their expertise around IoT and industrial But before we dive in on that, I wanted to just do a quick informal survey So for the folks in the room, around IoT, Internet of Things, how many folks in the room have deployed an IoT implementation, just by show of hands, or are planning to do so say, in the next 12 months? Wow, that’s a lot of people So that’s exciting to hear What we’ve seen from analysts is, there’s basically a claim that somewhere between 8 billion and 14 billion IoT devices exist out there today And that’s a lot of devices It’s also a big range, 8 to 14 billion I think one of the reasons might be, when you think about IoT, a lot of the implementations we see use a gateway, right? Like, if you think of this room, and if you want to control all the lights via IoT, then a lot of the implementations we see is, there’ll be a central box somewhere, some Linux machine, that all these lights are talking to on-premises, and then that gateway will connect to a cloud So what is it that’s counted as the 8 billion? Is it the one gateway or is it each individual light? 
I don’t know I feel sorry for the poor analyst who’s going around counting those 8 billion devices But it’s a lot of devices, more and more all the time And all of those devices are generating incredible amounts of data And dealing with the intake of all that data is one of the challenges when it comes to IoT Even though there are so many devices that are online, there’s another statistic which isn’t so great– and I don’t know what the experience of the audience is– people claim that around 1% of the data that’s collected for IoT is actually used to generate some sort of positive business outcome And so what’s happening to that other 99% of the data that is not generating that outcome? There is an opportunity there for everyone to say, how are we going to collect this data and make sure that the data we’re collecting has maximal value So that’s one place where we can see Google’s mission come in Google’s mission is to organize the world’s information and make it universally accessible and useful So when it comes to, say, Google search, that’s easy to think about Pulling up information from lots of websites, organizing it so that it makes sense, providing a nice, handy search bar that you can type into, and then at the end, giving you useful results So you tie that chain together IoT has the same opportunity, where we’re collecting all this data It needs to be organized in a uniform way, and then made useful So think about it: the lights in the room are collecting one type of IoT data, or telemetry data The HVAC system might be another one The alarm systems, another one How is all this data being collected, organized to make sense, and then producing a useful outcome that has business value? So those are some of the topics we want to talk about today But before diving right into the industrial use cases, I just want to spend a couple of minutes and baseline us on what are Google’s IoT offerings, so we know what the terms 
are and all have a common understanding before we dive deep into the industrial use cases So this is an image of Google’s IoT platform For those who haven’t seen it, it’s called IoT Core IoT Core is a component that exists within Google Cloud, Google Cloud Services IoT Core has a few major components built into it One component is this protocol bridge That is what will enable us to take these edge devices and safely and securely send that data into the cloud We often will talk HTTP, or MQTT, which is a more popular protocol for IoT and very efficient for IoT use cases The other component of IoT Core is this Device Manager, which allows us to understand each and every device Both, like the example we were talking about before, the gateway– like, in this room, if we had a gateway of lights, we would want to understand the gateway that’s talking to us, but also each device that’s connected to that gateway would want to be known to the Device Manager So as data is being sent from these devices into the cloud, we can say, oh, that’s a light and that’s an HVAC
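For orientation, the protocol bridge’s MQTT conventions can be sketched in a few lines of Python. The path formats are the ones IoT Core’s bridge expects; the project, region, registry, and device names are hypothetical placeholders:

```python
# Sketch of the naming conventions Cloud IoT Core expects when a device
# connects over MQTT. All names here are hypothetical placeholders.

def mqtt_client_id(project: str, region: str, registry: str, device: str) -> str:
    """The long-form client ID IoT Core requires on CONNECT."""
    return (f"projects/{project}/locations/{region}"
            f"/registries/{registry}/devices/{device}")

def telemetry_topic(device: str) -> str:
    """Devices publish telemetry here; IoT Core forwards it to Pub/Sub."""
    return f"/devices/{device}/events"

def state_topic(device: str) -> str:
    """Devices report their last-known state here for the Device Manager."""
    return f"/devices/{device}/state"

client_id = mqtt_client_id("my-project", "us-central1", "lights", "light-42")
print(client_id)
print(telemetry_topic("light-42"))
```

On a real device, the CONNECT would also carry a JWT signed with the device’s private key; that part is omitted here.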

And that’s an alarm Or whatever it may be Another component of IoT Core are some edge modules, where we have libraries that you can take, embed into your edge devices, that have pre-built connections to IoT Core You don’t have to use those edge components You can use the raw APIs of the IoT Core cloud modules But those components are there and extensible for you just for a quick start so you can get your IoT journey going a little faster Now, behind IoT Core is the rest of Google Cloud The first module we connect to is Pub/Sub It’s a large message bus that’s used across many Google components to allow services to interoperate Once the data gets to Pub/Sub, then it can be sent to other Google services Commonly, we see services such as Dataflow, that can transform data So for example, if we have data coming in from a light and data coming in from an HVAC, and they’re in different formats, we have to make them look common Dataflow will help us do that We can also call Cloud Functions if we want to run business logic on real time data that’s coming in in a streaming way And then many services behind that to store the data– we have multiple storage options that you can choose based on what type of data it is, how frequently it’s coming in, how frequently you want to do analytics on that data And then there are analysis tools that we have, such as Data Studio, as well as a suite of machine learning tools So this is not all the Google components, clearly, but it’s ones that we typically see on an IoT implementation You heard, potentially, yesterday in the keynote about multi-cloud and hybrid cloud And a lot of the implementations we see aren’t quite as clean as every single component up here is one of these Google components Which we’d love to see, but we have many partners and there are many players in IoT So oftentimes, this architecture is a suite of tools that we see between Google components, partner components, to come together for an architecture So one way to sort 
of think about the various components that are involved in an IoT architecture is this simple three buckets On the left, we see ingestion to gather the data So IoT Core is there to get the data Sometimes you have data that doesn’t need an IoT broker It’s already ready to be ingested So for that, you can use something like Pub/Sub just to take the data in In the middle, we need to clean up that data, process it And that’s the organize step in the Google mission Here we see Dataflow, Cloud Functions, data storage, various components On the far right is where we have to do analysis and understand this data And if we don’t see Google components in all of these, that’s fine What’s more important is, for the folks in the room implementing this, that they have a way of laying out the components they’re going to have in their IoT architecture and know what functions they’re going to operate on So before, we were seeing that 99% of the data in IoT is not reaching its full potential A lot of the reason why is because the process and clean step isn’t cleaning up the data as much as we need, so the data ends up looking like it’s in data silos from the lights and the HVAC, et cetera And the other reason is because there’s so much of this data It’s not hard for a single device to generate gigabytes of data in a single year if that device is talking multiple times per minute or multiple times per second And then multiply that times all the devices, and we’re talking about an enormous data load to be able to understand that data So we just don’t have the human capital to go through all that data And that’s where tools like– let me go back one– the machine learning tools come in that will help us, over here on the right, review all that data– let the machine look at the data and figure out how to find the insights rather than the people And that’s why ML and IoT go together so well So on the next slide, I wanted to talk about machine learning, where machine learning also has lots 
of tools you could use So what’s a framework that we can think about in terms of which ML components we want to use? And this is another block of three So here on the left, we have Google components that are pre-built, ready to use Google’s data, Google’s model Meaning, for example, vision We have a Google vision machine learning component to which you can upload an image A pizza, here’s a pizza And Google will say, I think that’s pizza And then you can go ahead and use that data Or it’s a building, or it’s a dog That sort of thing And these components on the left, there are more and more of them all the time Very simple to use, a great place to start your ML journey with all this data But oftentimes you’ll say, OK, I know that’s a pizza,

but I run a pizza company and I want to know that it’s one of my pizzas, because I have all these different pizzas that I make So how can I train that model? And that’s where the middle components, the retrained models, AutoML, will come in So like, AutoML Vision is a tool you can use And the models are pre-built, but the training is not And here, you’ll upload an image and say, here’s pizza, and it’s my supreme pizza And here’s another one, and it’s my, whatever, deep dish And these are the others So that Google will learn those images, so when it sees images of those in the future, maybe as the pizzas are coming out of the oven and you want to make sure it’s matching what the customer ordered, somebody could take a picture of it, match the point of sale, and say, yep, that’s the supreme pizza, just like the customer ordered with extra mushrooms And that’s where tools like AutoML can come in Any of that training you do is your data None of it would filter into the left None of it filters into Google It’s all your data And then sometimes, customers need even more They need not only custom trained models, but they need to build their own models And that’s where the data scientists define the rules you need, define the data sets You can use great tools like TensorFlow, either the open source or the hosted version on Google, and have models that are custom to your business And customers will often start on the left, use the out of the box ones, and work their way to the right as they need to Because the more you go to the right, the more coding you have to do And the more coding you have to do, the less you’re worrying about your core business And so that’s why we offer this suite of tools So with that, we have a baseline of IoT Core and the components So now what I want to do is invite Geir up, and he’s going to describe how Cognite has taken these tools, assembled them in an industrial IoT use case, and brought them into the real world And he’ll share with 
us some of the lessons they’ve learned and some of the tips for all of us to take advantage of Geir? Thank you GEIR ENGDAHL: Thank you, Adam [APPLAUSE] Hello, everyone My name is Geir I’m CTO and co-founder of Cognite, which is a software company that works with asset-heavy industries What that means is, basically, industry that has big machines that cost lots of money So we’ve worked mainly with oil and gas, also in power and utilities, and shipping Now, this talk will focus on shipping because it is the most challenging edge environment that we’ve encountered But before we dive into that, I want to make sure that you’re all awake So, quick show of hands, how many of you have had a cup of coffee this morning? Yes, that’s what I thought As we all know, coffee is one of the primary and most important inputs to writing good software But what are the inputs to coffee? It turns out that coffee sits at the top of a very long, complex industrial value chain And you don’t have to backtrack very far to understand this, because the beans, before they came to Moscone, they were on a truck And that truck needed fuel, hence the oil and gas industry Or, where I’m from, in Norway, I believe 60% of new vehicles sold last month were electric So maybe you have power and utilities too And the truck needed to be manufactured from steel, so you have mining, steel mills And before the beans got onto the truck, they were probably on a ship So there’s this huge value chain, which touches all the industries that we are in And it’s not just coffee It’s everything that we surround ourselves with that kind of makes our lives comfortable and convenient And that’s why I think it’s such a privilege to use data and algorithms to drive industry to be more efficient, produce more for less energy input, make it safer There’s really quite a lot of potential there So when we started Cognite, we set out, and we had this kind of naive belief that we would use sensor data and would do ML on that, and 
things would be great Now, it turns out, industry is a little bit different from consumer in that there are so many different data silos, different protocols, different systems that you have to integrate with And in order to understand the sensor data, you need a lot of context around it So maybe you need the engineering data Maybe you need the CAD model to figure out where the sensor actually is located, or you need the topology of how stuff flows through a power plant
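One way to picture that missing context: a bare reading only becomes meaningful once it is joined with engineering data. A toy sketch, with invented tag names and an invented hierarchy:

```python
# Toy illustration of contextualization: attaching engineering context
# (which asset a sensor belongs to, its unit, what sits upstream of it)
# to a raw (tag, value) reading. Tags and hierarchy are invented.

ASSETS = {
    "23-TT-9201": {"asset": "compressor-A", "unit": "degC",
                   "upstream": "23-PT-9101"},   # temperature transmitter
    "23-PT-9101": {"asset": "inlet-valve-3", "unit": "bar",
                   "upstream": None},           # pressure transmitter
}

def contextualize(reading: dict) -> dict:
    """Attach asset metadata to a bare (tag, value) reading."""
    meta = ASSETS.get(reading["tag"])
    if meta is None:
        return {**reading, "asset": "UNKNOWN"}  # flag for data-quality review
    return {**reading, **meta}

enriched = contextualize({"tag": "23-TT-9201", "value": 78.4})
print(enriched["asset"], enriched["unit"])  # which machine, which unit
```

In practice this lookup table is itself the hard part: it has to be assembled from CAD models, engineering registers, and plant topology.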

And so we were very lucky to start out with a large industrial customer very early to be exposed to this reality of industry And I think the core problem is that the lifecycle of industrial equipment is so much longer than the lifecycle of software A plant that’s five years old is new Software that’s five years old is old So then, with this data complexity that we witness, our mission quickly became to liberate all this data from the industrial silos and move it to somewhere where it’s always available And clean it so that it becomes understandable, so that it’s easy to build value from it As you can see, it’s quite a complex reality out there As Adam was saying, the scale of data is enormous A single sensor sending one value per second will produce one gigabyte of data every two years And that doesn’t sound like a lot, but the typical ship has 5,000 of these An oil platform can have 100,000 So it becomes a lot of data, and that actually informed our choice of Google Cloud as the vendor, because we believe in the ability of Google to run distributed systems at a scale that’s one step ahead of everyone else So we want to reuse the same infrastructure that’s powering YouTube, Google search, and Gmail to store all this data For instance, we’re storing all the IoT data, all the sensor data time series, in a system called Bigtable, which is a ridiculously distributed key value store And how we structure data into Bigtable to be able to query it efficiently and store data efficiently, could fill an entire talk And in fact, it does I did a talk on that last year at Next So if you’re interested in the details of that, you can Google Cognite Next Bigtable and you’ll find that talk As someone who loves technology, it’s really easy to get kind of sidetracked into building technology for technology’s sake And that usually doesn’t lead to great outcomes in the end And I’ve been guilty of that in my past So it makes sense to sit down with the people in industry, figure out what 
the real valuable use cases are And some of these use cases span every vertical, every industry that we are in– such as fuel efficiency, or energy efficiency in general Nobody wants to waste energy Energy is a cost In fact, in shipping, it is the number one cost A single percent reduction in fuel costs for an average freight ship is a $50k per ship annual savings And for a fleet of 100 ships, that’s $5 million And by the way, we’re not targeting 1% Our customers estimate anywhere from 5% to 15% savings, depending on the type of ship and the age of the ship And then there are other use cases, which are specific to each industry, such as weather impact on cargo You want to be able to tell your customer that you didn’t damage the goods And you can measure the motion of the ship at any time So you can kind of say, hey, look, I didn’t go through too rough seas This is what a data silo looks like We had this great piece of equipment, a sensor, that is using lasers to measure the torque on the propeller axle on the ship So there’s a laser underwater sending a beam along the axle It’s reflected It’s detected It measures the microscopic deformation of the axle due to the force enacted on it Now, contrast that with the way that this data is sent to the cloud Because what happens is, every day, this guy walks down to the engine control room and writes down the value on a piece of paper, and it goes into a report But that’s not– the reason this particular measure is

important is, if you want to solve the energy efficiency use case for a ship, you need to be able to break down the conversion of fuel into ship motion into its two main parts That’s the engine efficiency So how much force do you get onto the water via the propeller for each unit of fuel used, and the hull performance So for the force that the propeller acts on the water, how far do you go? And a once per day resolution is nowhere near what you need to solve the use case Because the engine settings change too often Weather changes all the time So here, to get the data out, we had to install an adapter And this is a pretty common pattern The data sits on a separate network, so you can’t talk to it on the ship’s network The adapter converts the data and sends it onto the ship ethernet, where it can be picked up by an edge box And you have many of these systems that you need to liberate data from Here’s another issue Connectivity The good news is that you can be connected virtually anywhere on the planet The bad news, it’s kind of expensive and the bandwidth is not great One of the things that surprised me about this is that you actually get quite decent latency Your data can go from the laser to the cloud in 200 milliseconds I find that pretty awesome But we have too much data to send over this connection And most of the data is not that useful for this use case anyway So given those constraints, here’s how we approached solving the situation First, given the bandwidth constraints, there is a need for an edge component It needs to manage the bandwidth So some of the data is high priority and should be sent directly to the cloud Some of the data needs to be buffered on the edge, stored there until connectivity is better When the ship is close to shore, it will get cellular coverage, which is much cheaper and has much more bandwidth, and you can bulk upload that data You also may want to do some processing on the data Downsampling or compressing the data in other ways For instance, vibration 
sensors on rotating equipment will often send you data at 50 kilohertz So that’s 50,000 data points per second That’s like, 10 times more than the rest of the sensors on the ship combined You can’t send all of that data And you don’t need to Because if you do a Fourier transform, convert the signal into the frequency space, and send the coefficients of each frequency that constitutes the signal, then you get almost exactly the same curve with 1/1000 of the data transmitted, which is also a very cool stat And you may also want to use it as a cache If you have applications on the ship– if you have, like, a digital worker application running on Android phones where the workers on the ship can get all the sensor data from the ship at any time– they may not want to go via the cloud for that So you can ask the edge box directly And there are also more interesting use cases, which we will get into, that do more interesting compute and predictions on the edge Also, this is not your typical IoT consumer scenario where you have to have very, very light hardware It actually makes sense to invest in rugged hardware that won’t break, in this case And it also needs a bit of storage because it needs to buffer all this data An average ship will produce something like 500 megabytes per day of data And it can be at sea for up to two months at a time So you need some storage there And you also don’t want to lose that data if one disk dies, so it’s a RAID configuration It’s a rugged PC with no moving parts So here’s what the architecture looks like You have all these silos at the bottom, the green boxes They’re typically on their own networks
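As an aside, the frequency-space compression Geir describes can be sketched with a plain-Python DFT. A real edge box would use an FFT library over much longer 50 kHz windows; this only illustrates the idea of transmitting just the strongest coefficients:

```python
# Sketch of frequency-domain compression: transform a vibration-like
# signal, keep only the strongest frequency coefficients, reconstruct.
# Plain-Python O(n^2) DFT, for illustration only.
import cmath
import math

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(coeffs):
    n = len(coeffs)
    return [sum(c * cmath.exp(2j * math.pi * k * t / n)
                for k, c in enumerate(coeffs)).real / n
            for t in range(n)]

n = 64
# two dominant vibration frequencies
signal = [math.sin(2 * math.pi * 3 * t / n) + 0.5 * math.sin(2 * math.pi * 7 * t / n)
          for t in range(n)]

coeffs = dft(signal)
# keep only the largest coefficients (4 of 64: each real sine occupies a
# conjugate pair of bins), zero the rest -- this is what gets transmitted
keep = sorted(range(n), key=lambda k: -abs(coeffs[k]))[:4]
compressed = [coeffs[k] if k in keep else 0 for k in range(n)]

reconstructed = idft(compressed)
rms_error = math.sqrt(sum((a - b) ** 2 for a, b in zip(signal, reconstructed)) / n)
print(f"kept {len(keep)}/{n} coefficients, RMS error {rms_error:.6f}")
```

Here the signal really is two pure tones, so four coefficients reconstruct it almost exactly; on real vibration data you keep more coefficients and accept a small error in exchange for the bandwidth savings.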

You can’t access them via the ship network So you need a physical adapter And in our case, those are always set up to be read only, which makes the whole situation less scary from a security point of view Then it’s picked up by the edge box And what it needs to do is translate from all of these different industrial protocols Here we have Modbus, OPC, OPC-UA, MQTT, and a wide variety of other dialects that you need to talk Once you’ve translated the data into a sane format, put it on another queue, and then it’s processed It’s matched against the list of high priority tags And some is sent directly over the satellite link, which is managed by Cloud IoT Core And the other data is stored and sent over a cell phone connection One of the things that we didn’t do, which we are considering doing, is to move to EdgeX EdgeX is a framework for edge solutions, which will do some of the translation for you And it has a lot of components that you can put together to solve for many of these scenarios So Cognite is almost 200 people, and it’s just over two years old One of the things that we value very highly is speed And one of the things that we do to get speed is to always try to use managed solutions where we can You don’t want to reinvent the wheel And when you’re looking at this next slide, you might wonder if we do anything at all Because it looks like Google is doing all the work But there is some stuff there Like, Cloud Functions– we actually write the stuff that’s in the Cloud Function– and the Kubernetes Engine, where we have our business logic So on the cloud side, basically, what IoT Core gives us is a managed connection We don’t need to scale that thing And it has encryption Authentication is taken care of And it conveniently puts the data on the Pub/Sub queue, which we use for the rest of our system So, to us, we just interact with the Pub/Sub queue Which triggers a Cloud Function, which translates the data on the queue 
into API calls to our API gateway So our solution is used by many customers in many verticals, many industries, which have slightly different ways of sending data So not all of it is going through IoT Core And that’s why the API gateway is kind of the one gate, which is the common denominator there That’s why you need that translation Same thing with the daily logs, the backfilled data It’s uploaded to GCP– Google Cloud Storage, I mean And that triggers a Cloud Function whenever a new file is uploaded, which does the same translation So it’s the same protocol buffer-based format Now when we get to the yellow box, we have a pipeline for processing the data, structuring it so that it doesn’t use a lot of space, and computing rollups so that you can do advanced queries with millisecond latency And this pipeline, we’ve run at 10 million data points per second So every component there– Kubernetes Engine with autoscaling, Pub/Sub, Bigtable– scales incredibly well It all scales horizontally And it’s very fast And then when we do run predictions on this data, we use ML Engine to host those models And we use Cloud Scheduler to do periodic predictions We don’t write back to the control systems So the use cases that we have are about advising, say, the captain You know, you need to slow down a bit because you’re using too much fuel Or you need to clean the hull, because there’s marine growth, which is slowing you down It’s something that you can do periodically Maybe you do a prediction every minute You can think of cases where you’d want to do streaming analytics as the data comes in But right now, it’s not necessary
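The rollup idea behind those millisecond-latency queries can be sketched as a simple windowed aggregation: precompute per-window aggregates so a query over a long range reads a handful of rollup rows instead of millions of raw points. The window size and field names below are illustrative, not Cognite’s actual Bigtable schema:

```python
# Sketch of time-series rollups: per-window count/min/max/sum aggregates
# precomputed from raw points. Window size and schema are illustrative.
from collections import defaultdict

WINDOW_SECONDS = 3600  # hourly rollups

def rollup(points):
    """points: iterable of (unix_ts, value) -> {window_start: aggregate}"""
    agg = defaultdict(lambda: {"count": 0, "min": float("inf"),
                               "max": float("-inf"), "sum": 0.0})
    for ts, value in points:
        w = agg[ts - ts % WINDOW_SECONDS]
        w["count"] += 1
        w["min"] = min(w["min"], value)
        w["max"] = max(w["max"], value)
        w["sum"] += value
    return dict(agg)

points = [(7200 + i * 60, float(i)) for i in range(120)]  # 2 hours of minutely data
for window, a in sorted(rollup(points).items()):
    print(window, a["count"], a["min"], a["max"], a["sum"] / a["count"])
```

A query for, say, the hourly average over a year then touches ~8,760 rollup rows rather than ~31 million raw points, which is what makes millisecond latency feasible.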

Security So the edge environment is kind of scary The control systems that you deal with there are set up such that they usually don’t have the concept of authentication So if you can talk to them, you can do whatever you want with them A couple of principles that we apply: we don’t want to invent our own encryption scheme We let Google IoT Core handle that We have no inbound routes to the edge And there’s a physically separated network So we only read from these sensors So that makes it a lot less scary to be on the edge But security is definitely a big concern here So bringing that together Here we have one of the dashboards, which will show you and verify that the cargo was not damaged This is the ship’s motion in all the different directions, axes, at 10 Hertz with 200 milliseconds latency So if you’re shipping cars, for instance, you could share this with the car manufacturer and really assert that you didn’t damage any of the goods It’s also very important input to the energy efficiency use case Because if you have rough seas, your engine will have to work more In fact, very rough seas can make the engines shut down entirely, which is what happened with the cruise ship, Viking Sky, a couple of weeks ago outside the coast of Norway There were rough seas and the engines tripped as a result of that, leaving the ship in a very precarious situation And then when we move on to the more advanced use cases that involve machine learning, you have, of course, anomaly detection, which is very much used in predictive analytics The models there are usually based on clustering or forecasting So you’re looking for data that you haven’t seen before, basically And then you’re alerting someone Inspection We have a lot of sensors that are not just pressure, temperature, flow We have a sensor in our pocket, which is a camera And you can use that to extract loads of interesting information using image recognition So inspection– for instance, last year, we trained an 
AutoML model which detected damaged wires And you can imagine having drones fly around and detect these faulty things, or corrosion, which is already being done by some companies And in those cases, you definitely do not want to send all the footage to the cloud if you’re on a satellite connection STEIN H. DANIELSON: [INAUDIBLE] Yeah So, Geir, I heard some rumors about your demo last year, that you had the windmill on stage and did some machine learning as well? That must have been pretty terrifying GEIR ENGDAHL: Yes A good demo is exciting to the audience, and very scary to the presenter So I think, especially when you have moving parts and you’re doing machine learning– which is never really 100%– that’s always scary But you know, the one trick is, you can always blame the Wi-Fi [LAUGHTER] But yeah, so I think we should try to top that and do something even scarier this year STEIN H. DANIELSON: Yeah, definitely And that’s why we brought this tiny model of a chemical plant It continuously streams live sensor data to Cognite Data Fusion via Google IoT Core So typically, you use this sensor data to visualize it in charts like you can see soon on this screen So here you can see the temperature– it’s about 27 degrees That’s pretty hot But there are other ways to visualize your data, as well At Cognite, we see the power of using 3D models as a tool to visualize your data And that is definitely a tool for the future GEIR ENGDAHL: I see that But what if you don’t have a 3D model? STEIN H. DANIELSON: That’s definitely

a big issue and a challenge for the industry Because either you don’t have a 3D model, or perhaps the 3D model already got old from the day of assembly and has become kind of useless So that’s why we have developed an application that helps you generate these up-to-date 3D models in order to visualize your data in a better way GEIR ENGDAHL: You’re not connected, I think STEIN H. DANIELSON: OK GEIR ENGDAHL: It’s the Wi-Fi STEIN H. DANIELSON: Yeah, definitely the Wi-Fi So what we can see here is the model on screen I take a single picture and I detect features on it I could take one more, and then it guides me through the entire process So this is something that I do now on my phone, but this is something that you can do with any kind of camera And it’s not something that you have to do yourself, either GEIR ENGDAHL: Yeah So you just need the photos for this to work, you don’t need a special camera You might wonder why there’s a helipad here That’s because, sometimes when we run this, we use a drone Unfortunately, if you look closely on the sign outside the door here, it says no drones are allowed And the complexity of getting that permission in time was unfortunately too high for us But it doesn’t matter, you can use anything– like a mobile phone, or a drone, or a GoPro on someone’s helmet It doesn’t really matter STEIN H. DANIELSON: Yeah GEIR ENGDAHL: So what’s happening to those pictures right now? STEIN H. DANIELSON: Yeah Now that I’m finished taking the pictures, I just upload them to Cognite Data Fusion, where the magic happens And right now, it’s processing all the pictures And should we take a look at the final results? GEIR ENGDAHL: All right Let’s try that Now I’m scared STEIN H. DANIELSON: If not, the Wi-Fi has failed GEIR ENGDAHL: Ooh, ooh! Yes And here it looks like we got a nice model STEIN H. 
DANIELSON: So suddenly we have a 3D model that’s up-to-date, with the exact same dimensions, the exact same proportions And also, you get a good overview of the texture on the model, as well GEIR ENGDAHL: But if you want to view the time series in the right context, here, I guess you have to do some kind of manual– STEIN H. DANIELSON: Yeah Yeah So the thing is, this is definitely a really good tool for the future of the industry But wouldn’t it be cool if we could link this model with the data that we have inside the Cognite Data Fusion platform? And that’s actually something that we can do So if we take a close look at the tags that are located all around this model, you can see T-02, T-01 Those kinds of tags are present on all platforms, or on ships as well, and identify the different assets That identifier is something that we also have in Cognite Data Fusion So what we can do is use the same pictures that we took here and try to detect all the different tags using Google Vision And when we have detected them, we store the location of all the different assets, and also which asset it is GEIR ENGDAHL: So we’ll see if it got them in the right– STEIN H. DANIELSON: And the end result is here [APPLAUSE] GEIR ENGDAHL: So what was happening behind the scenes there is we used a model called YOLO, You Only Look Once, which is an object detection model And that thing is trained to look for these– STEIN H. DANIELSON: Tags there GEIR ENGDAHL: The tags But it can identify other stuff too, like rust, or faulty wires, et cetera, and place it right there in the 3D model That’s instant situational awareness, there This room is too hot STEIN H. DANIELSON: So we can demonstrate this for you Let’s see if I’m hotter or cooler than 28.2 degrees So I’m cooler GEIR ENGDAHL: Yeah STEIN H. DANIELSON: I’m almost dead GEIR ENGDAHL: I guess maybe you have some cold feet, too? STEIN H. DANIELSON: Yeah But then we’re inside the operational limit again,

and that’s good Because it turns red when it’s above 27 GEIR ENGDAHL: Yeah You’re actually cooling the room down STEIN H. DANIELSON: Yeah So, yeah And this is a model that you can update 24/7, at least when you have a robot to take the pictures for you And that’s really convenient when it comes to detecting when corrosion started You can actually travel back in time– you can check whether the corrosion mark was there only two weeks ago, or perhaps two years ago So to conclude, five minutes ago, we only had a chemical plant that streamed live sensor data But now, we also have an up-to-date 3D model that displays our sensor values in a really intuitive way Thank you [APPLAUSE] GEIR ENGDAHL: And you can do this with real industrial equipment, too This is data now streaming in from the North Sea, and it identified the plates on that equipment, too And also, here, you can find some things that shouldn’t be here For instance, these ladders They’re not supposed to be in this zone, so it’s a health, safety, and environment issue, right there Then let’s wrap up Here’s my checklist I wish this checklist was shorter, but it’s not This is the reality of why it’s hard to create value in industry So you have to start with the valuable use cases Start with that, and talk to the people in the business, because you will depend on them to implement it later So it better be something that they think is valuable, too Then, you need to figure out if you have the data that’s necessary to solve those use cases Usually, you have the sensors, but you may have to do some work to liberate the data And then you need to clean the data, make it so that you can actually solve your use cases For instance, linking it to a 3D model It may seem trivial when you have, kind of, a plant with four sensors But if you have 100,000 sensors, and you send the data scientist in to find the right sensors to do predictive maintenance on your one compressor or whatever, it’s going to take months if you don’t have good tools 
to figure out what sensor is where, and which is relevant to the use case Then you need to do stuff with the data, so that’s algorithms Usually, you don’t build your own You merely apply other people’s work Most companies don’t do machine learning research– they do application of known research to new problems Now once you’re satisfied with those results, you need to monitor the quality of the data Because your model probably won’t handle garbage data coming in And these systems, these sensors, will give you a lot of crap data from time to time There will be connectivity issues There will be flipping bits here and there It’s quite a challenging thing to do, so you have to build that trust in the data And then you have to change the way that people work in the company And there, it’s very useful if you’ve introduced your use case early on, so they know this is not something you made up Worst case, you’ve done a lot of work, and you get to this stage, and it’s like, hey, this isn’t useful That sucks And then last, when you’ve done all those things, you profit And that’s what IoT is really about– otherwise it’s not sustainable Here are some pointers to get you started So for IoT Core and all the AI solutions that Google has to offer, we have a booth down on the lower level with the industry solutions It’s on the left after taking the escalator down Come talk to us there if you have questions Thank you And also, don’t forget to rate this talk We definitely appreciate your feedback so that, you know, next year we can have an even better talk and an even scarier demo Thank you [APPLAUSE] [MUSIC PLAYING]