Welcome, everyone, to today's webinar, "Easily Integrate Data with Salesforce Lightning Connect." This is Marlys, and I'll be your host for today's session. Beginning today's session is Matthew Monahan, Progress's Principal Product Manager for DataDirect Cloud. Matthew has nearly 20 years of experience in building and operating software-as-a-service offerings and joined Progress in 2014. Also presenting today is Jason Choi, Vice President of Product Management at Salesforce. Jason is responsible for integration products, including data loading, Canvas, enterprise APIs, and the latest addition, Lightning Connect. Joining Matthew and Jason to facilitate today's Q&A discussion is Greg Stasko, our business development executive at Progress DataDirect. I would like to introduce Matthew Monahan to kick today's session off. Welcome, Matthew.

Thank you, Marlys. I'm really excited to talk about what we have today, both for DataDirect Cloud and for Salesforce, and as Marlys mentioned, Jason Choi will be talking a little bit about Lightning Connect today. So here's what we're going to do — I'll bring up the agenda here. To kick things off, I'm going to talk briefly about Progress Software and DataDirect, and then I'll hand it over to Jason for an introduction to Lightning Connect. Following that, I will talk a little bit about DataDirect Cloud and our integration with Salesforce Lightning Connect, and then provide a demonstration of DataDirect Cloud connecting to back-end data and bringing it, through OData, into Salesforce via the Lightning Connect integration. As Marlys mentioned, we'll finish up with Q&A.

Let me tell you a little bit about Progress Software. Of course, during the course of this presentation, while our emphasis will be on current functionality and a demonstration of what our product can do today, we may discuss things that are forward-looking. I just want to be clear: please be advised that any forward-looking statements about functionality that's in progress or will be released in the future are always subject to change, and you should not make any purchase decisions, of either products or company stock, based on that information.

Progress Software has been around for over 30 years now, and the DataDirect business, as part of that, has a focus on connectivity to a variety of different database systems — from relational systems to NoSQL, NewSQL, and big data systems — as well as connectivity into software-as-a-service application data and some of the transformations that happen for those products. So we offer quite a range of data connectivity and integration. We've been doing this for a while, and we're very happy to be working closely with Salesforce to integrate our technology — the things we've been using for years — and bring some of those capabilities into the Salesforce environment. So what I'd like to do now is turn things over to Jason Choi and let him tell you a little bit more about Lightning Connect.

Thank you, Matthew. Today we see more and more customers interested in using Salesforce as their front office — that is, using Salesforce as a system of engagement to connect with their customers. At the same time, like many enterprises in transformation, they still have their transactional, financial, and operational data stored in back-office systems, and that means there is a huge need for integration: they would like to be able to see the data stored in those back-office systems in Salesforce. The traditional approach, as you can see on the screen here, is to use different kinds of data-loading capabilities — whether manual file loading, an FTP process, point-to-point processes, or even an enterprise message bus for loading data into Salesforce. That works for some use cases, but in certain use cases it presents some challenges. One is that when you copy data — an extra copy of data — into Salesforce, that
means the moment you finish copying, the data becomes stale, because the master will change down in the back-office systems — so you need to maintain a synchronization process. Keeping data in sync when it's megabytes or gigabytes is probably easy, but when your data set is terabytes or petabytes in size, syncing the data becomes a real challenge. The other challenge is cost and complexity. Building this kind of integration solution may be doable for simple jobs, but when you have data modeling to do, or different kinds of metadata and schemas to maintain, you require specialized talent — ETL and EAI people — and it also takes time to configure the synchronization capabilities, and that translates into money.

So that's why at Salesforce we looked at all these challenges and came up with a new solution called Lightning Connect. Lightning Connect is a new integration product we've introduced. It is easy to use, meaning the whole integration capability can be set up simply with mouse clicks, and the data is accessed in real time. What that translates into in business value is that your business users in Salesforce will always have access to the real-time data stored in your back-office systems, so they will never have the doubt that the data they're seeing on the Salesforce screen is different or outdated from the master. And last but not least, because it's a new integration approach, it opens the door for our customers to be more innovative in how they access and use the data — whether in their social applications using Chatter, or from mobile devices — it opens up brand-new opportunities for our customers to access their back-office data.

So let's take a quick look at how Lightning Connect works. For those of you familiar with the industry trend, Lightning Connect is similar to what is commonly referred to as data virtualization, or a data federation approach. What that means is that with Lightning Connect, the Salesforce server makes an outbound web services call over the internet to fetch the data in real time from the data source. The data source could be SQL Server, could be Oracle, could be SAP — the data is fetched in real time over the internet. And when we do fetch the data, we're not going to keep a copy of it. The data is only fetched on an on-demand basis, meaning we are not going to copy the entire data set; we will only
be querying the data needed for the specific request. So if you want to pull the data for Jason, you will only pull the 20 rows of data for Jason's records, rather than copying the entire database into Salesforce. And when we pull the data into Salesforce, we also do not make any copy of it — the data will not be stored anywhere in our database or file system. The data exists only in our server memory during the processing. And while the data is sitting in our memory, we actually take one more step: we format the data into the Salesforce object representation. What that translates into is that your data, sitting outside in an external system, is presented to your Salesforce users just as if it were another Salesforce object, so that many of the high-level Salesforce operations can be applied directly to the external objects: you can run queries, you can run searches, you can build Visualforce pages or code against it. And last but not least, so that you have an open system, we adopted an open standard — the OData protocol — as the wire protocol for accessing the data, because we believe that using an open standard allows a much healthier ecosystem, letting our partners and customers build real solutions easily.

So let's take a look at why Lightning Connect and DataDirect Cloud go hand in hand. For Lightning Connect to work, the Salesforce server needs to be able to access your data endpoints from the public internet, and today we don't have any way to penetrate a firewall. But most of our customers keep their data sources in back-office systems, on premises, behind a firewall. That's one area where DataDirect Cloud comes to the rescue. The second thing is that, as I mentioned earlier, Lightning Connect relies on the Open Data Protocol — OData — for the communication. Many data sources might have an OData interface, and those will work, but by the same token there are many other data sources that do not have an OData interface. That's another area where DataDirect Cloud can help, by providing an OData interface for those database systems, allowing Lightning Connect to access those data sources easily. So, Matthew, back to you, so you can tell the audience how DataDirect Cloud comes in in these cases.

Thank you, Jason — I really appreciate that introduction to Lightning Connect. I hope what our audience has picked up so far is that this really is a new paradigm for data integration between back-office data and Salesforce as a system of engagement. It's extremely powerful, and as Jason mentioned, it requires OData to leverage that full capability. So what I'm going to tell you about today is DataDirect Cloud — a little bit about how we manage data sources.

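To make the on-demand access pattern Jason describes concrete, here is a rough sketch of the kind of OData query URL a consumer such as Lightning Connect might issue — only the rows matching the filter are requested, never the whole table. The service root and entity names below are placeholders, not a real endpoint:

```python
from urllib.parse import quote

def odata_query_url(service_root, entity_set, filter_expr=None, top=None):
    """Build an OData-style query URL. Only rows matching the filter
    are requested -- the on-demand pattern described above -- rather
    than copying a whole table."""
    url = service_root.rstrip("/") + "/" + entity_set
    params = []
    if filter_expr is not None:
        # $filter narrows the request to the specific rows needed
        params.append("$filter=" + quote(filter_expr))
    if top is not None:
        # $top caps how many rows come back
        params.append("$top=" + str(top))
    return url + ("?" + "&".join(params) if params else "")

# e.g. fetch only Jason's rows instead of the entire Customers table:
url = odata_query_url("https://svc.example.com/odata",
                      "Customers", "FirstName eq 'Jason'", top=20)
```

An HTTP GET against a URL like this returns just that slice of data, which the consuming server can discard after use — nothing is persisted.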
We connect to those sources and expose them via OData for use in Salesforce. So, a little bit about DataDirect Cloud: our philosophy at Progress has always been around standards and performance, and we offer an on-premises connector which enables you to securely traverse the firewall with your data. What that means is that for your systems of record that sit behind the corporate firewall — whether a relational database like Oracle, Microsoft SQL Server, IBM DB2, or a number of others — we can provide our on-premises connector. Rather than having to expose OData on your network and open a firewall hole so that Salesforce can connect in, we instead connect out to our cloud environment and provide a secure mechanism for moving that data around. The second thing is standards-based connectivity. We do provide ODBC and JDBC drivers for our cloud environment, but most important, we provide that OData translation. As we get into the demonstration, you'll see the range of data sources we support in our cloud environment today; any of those data sources can be exposed via the OData protocol and thereby consumed by the Salesforce Lightning Connect capability. And finally, we provide access to software-as-a-service, or cloud, data via the APIs those providers already offer. What we do is wrap those APIs and provide some structure around them so that we can expose the data — whether via OData, which has some very specific schema requirements, or ODBC or JDBC — depending on your specific use.

I want to talk about a specific use case we have with one of our customers. This is a large financial firm that currently has 11,000 Salesforce users, and these users are accessing data through Salesforce as their system of engagement. Some of them have been using Salesforce for a number of purposes and use cases, some are new to Salesforce, but they all need access to a variety of data stored in the legacy system of record, a Siebel system. This large financial firm was looking to mitigate some of the business risks Jason was talking about earlier with the traditional ETL solutions, as well as the risk of making such a massive system change from one system to another. With 11,000 users, you want to make sure they are up and running, productive, and able to access the data they need without any interruption in that flow.

Some of their specific requirements: first, keep the system of record inside the firewall, meaning they didn't want to expose this critical data in any kind of cloud system — it needed to be accessible there, but not stored there. Second, provide real-time access; as Jason mentioned, one of the challenges with the traditional ETL approaches is that the moment you move data from one location to another it's out of date. There's always the potential that changes have occurred — potentially even during the ETL process itself — which can create challenges in keeping that data current. And third, performance and high availability were two extremely important considerations. With 11,000 users — mostly sales and support individuals — if you think about the value of what those individuals are doing, as well as the cost of them not being able to access their system, it's significant. If data takes minutes to load, that's too long.

So what were some of the other options this firm had considered? Of course, they looked into the usual third-party ETL tools and the capability of storing data in custom objects within Salesforce — we've talked about some of the challenges there. The second was a third-party application — an outside application that would pull the data and make it available — but then you get into the swivel-chair problem: now you've got more systems to maintain, and there are challenges there as well. The third option was writing custom code within the Salesforce environment — generally something like Apex, or other code within Salesforce that would tunnel back into the corporate network and pull that data in in real time — but there's an enormous amount of complexity and cost that goes into that. When this organization took a look at these usual options and then found out about Lightning Connect, they got really

excited about the ability to have that real-time connectivity into the data while keeping it on their network. But the two things they needed to accomplish were, one, getting through the firewall, and two, providing OData. For their Oracle database that holds the information from Siebel, it would have been a significant amount of work to set up some kind of OData server and provide that connectivity into the Oracle database, and any time they made changes it would potentially have required more code work — a lot of complexity they didn't want to tackle. So they ended up going with Lightning Connect and DataDirect Cloud.

We provide that last-mile connectivity between our cloud environment and the Oracle database. We have an Oracle driver — a wire-protocol driver that is extremely high performance — and we provide it inside our on-premises connector, which can then communicate out from the corporate network into our cloud environment in a very secure way: encrypted traffic, everything highly reviewed for security as well as for performance. So the return communication from Salesforce comes through our cloud and gets into the database in a very secure manner. Not only was this a great solution for this financial firm's use case today, but it also made a very easy path for them to add additional data later. Like many organizations, they're exploring data lakes, some of the NoSQL databases and newer technologies, as well as some of their other legacy systems — for example, Microsoft SharePoint — that also sit behind the firewall. So this is a very short path for them to extend this solution in a way that lets them pull in additional data without any additional configuration on the Salesforce side — no additional integration work required.

One of the things that was really appealing to the folks at this firm was that it was a clicks-not-code solution. It didn't require a massive IT project; it didn't require weeks or months of expensive resources doing development work that would then have to be maintained going forward. It was really a point-and-click solution, within the Salesforce environment as well as within DataDirect Cloud. What we're going to do now, in this second half, is actually walk through a demo so you can see what I'm talking about and understand how simple it really is to establish this connectivity and bring the data into Salesforce.

So I'm going to switch over to my browser — let me just make sure I am still logged in here. This is DataDirect Cloud. DataDirect Cloud is a combination of a web interface and a set of APIs; everything I'll show you today through the user interface is also available through a RESTful web interface. It's an API that allows you to configure data sources and access your data — all of that is available via API. The dashboard and the web UI just make it a little easier to get started. So I'm going to go into Data Sources — and I'll take a quick pause here to click on Data Stores, to give you an idea of the data sources we support today. You'll see there's a range of databases, from traditional relational databases to some of the newer ones, some of the big data stores, and some of the software-as-a-service applications out there. We are constantly adding to this list — in fact, there are a number in our beta list that are going to move into GA in the coming weeks — so this is a constantly evolving list, and we're always looking for our customers to provide feedback on the data that's most important to them and how they'd like to see it connected.

Let me take a look into one of my data sources, to give you an idea of the information that needs to be entered to make a connection for the first time — it's all on this first page. We have a data source name — within the DataDirect Cloud environment, I've set a name to reference this specific data source — and I've provided a description, just so I understand where it is. This is a simple Amazon RDS database that I've set up.

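The fields on this page — server, port, database, username, password — are the same pieces any standard SQL Server client needs. Purely as a point of reference (this is a generic illustration with placeholder values, not DataDirect's internal mechanism), an ODBC connection string for such an RDS-hosted SQL Server instance could be assembled like this:

```python
def sqlserver_conn_str(server, database, user, password, port=1433,
                       driver="ODBC Driver 17 for SQL Server"):
    """Assemble a SQL Server ODBC connection string from the same
    fields entered on the data source page. The driver name and host
    used below are illustrative placeholders."""
    return (
        f"DRIVER={{{driver}}};"
        f"SERVER={server},{port};"
        f"DATABASE={database};"
        f"UID={user};PWD={password}"
    )

conn_str = sqlserver_conn_str(
    "mydemo.example.us-east-1.rds.amazonaws.com",  # placeholder RDS host
    "AdventureWorks", "demo_user", "demo_password")
# With the pyodbc package installed: pyodbc.connect(conn_str)
```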
It's based on Microsoft SQL Server, hosted at Amazon, and I set it up for demo purposes to show how it's connected. I've entered my database username and password; I've specified the server name — the name by which my database instance is accessible — as well as the standard port number; and within that database server I've identified the specific database I'm connecting to. If you're familiar with Microsoft SQL Server, AdventureWorks is a common sample data set, so you may recognize it as we go through this demonstration. So far, that's the most basic information that needs to be entered, and if I click Test Connection, this will go out and make sure that connectivity between DataDirect Cloud and the database has been established successfully and that the username and password are valid.

If I go back to my Data Sources page, select that AdventureWorks database, and click on SQL Testing, what I can demonstrate from the UI is that within this database I can look at all the different schemas available. I can select one, such as Sales, and expand it — you see that I'm already connected to the database and I can look into my database structure: the schemas, the tables, the columns. And if I wanted to, I could actually execute a query here, such as SELECT * FROM Sales.Customer, and bring that data back into DataDirect Cloud.

Great — I've established connectivity, but what I really want to do is bring this data into Salesforce. So how do I do that? I'm going to go back to my data source, edit it, go over to the OData tab, and configure a schema. To understand a little about what we do: we connect to your relational databases — things you're very comfortable with in terms of SQL queries and relational structure, tables, rows, and columns — but we also connect to APIs, which might be objects and attributes with a very different structure. No matter what we connect to, we can expose all of it in a very organized, relational kind of way — tables or objects, rows and columns. Some of that is required for OData, because OData is a REST API, but one with a defined schema: it has the traditional HTTP GET, POST, and PUT calls you have in a REST API, but with the structure that's necessary for something like Salesforce to read and understand those APIs programmatically.

So, clicking Configure Schema, I'm going to select the Sales schema again, and these are the tables available within the Sales schema in my database. I'm going to say: I want to look at sales information. The sales information in my back-office system is what I want to bring into Salesforce, so that my sales reps or customer service reps can access it when they're on the phone or exchanging emails with a customer or prospect. I'm going to expand this so you can get a sense of what's here, and you'll see that within SalesOrderHeader I have a number of fields — all that information is available. You can see that DataDirect Cloud has identified, from the database structure, that SalesOrderID is the primary key, and one of the fields in here is CustomerID. Of course, I'm going to need to relate a particular order record stored in this table to some kind of individual or contact record in Salesforce, so CustomerID is going to be useful. But maybe I don't have a customer ID — maybe I have an account number stored in Salesforce. So I want SalesOrderHeader, but let me also go into my Customer table, where CustomerID is the primary key — and there we go, there's AccountNumber, and maybe the account number is the item I need to relate in Salesforce. So I'm also going to check off Customer. I now have Customer and SalesOrderHeader selected, and I'm going to save this selection.

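An OData endpoint configured this way exposes a service document listing its entity sets, and that document can be checked programmatically as well as with a GUI tool. In the OData 2.0 JSON (verbose) format, the entity sets appear under a top-level `d` object; a small sketch — the sample payload here is hand-written to mirror what this demo's endpoint would return, not captured from it:

```python
import json

def entity_sets(service_doc_json):
    """Extract entity-set names from an OData 2.0 JSON service
    document, which wraps its payload in a top-level "d" object."""
    doc = json.loads(service_doc_json)
    return doc["d"]["EntitySets"]

# Hand-written sample of a service-document response body:
sample = '{"d": {"EntitySets": ["Customers", "SalesOrderHeaders"]}}'
print(entity_sets(sample))  # -> ['Customers', 'SalesOrderHeaders']
```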
What you'll see here is the schema map that's been generated from my selections in the UI — and remember, everything is API-accessible, so I could just make an API call that sets this string for the schema if I wanted to. It includes, within the Sales schema, SalesOrderHeader and Customer; those are the two objects that will now be exposed via OData. If I wanted to do some testing — I won't go into too much detail on the testing tools — I could copy this service URL and paste it into a tool such as Postman. Let me save this and click Update. Within a tool like Postman, I can send that same service URL, submit it with an HTTPS GET, and it will come back and tell me that Customers and SalesOrderHeaders are the two entities available at this OData endpoint.

So what have I done so far, in the past five or ten minutes? I've established connectivity to my database, and I've done an OData mapping, which has exposed the data via an OData endpoint. Now what I want to do is bring this data into Salesforce. I'm using a Salesforce developer account here — I don't believe Jason mentioned this, so I'll just throw it out there: if you create a Salesforce developer account, that account automatically has the ability to create external data sources, and I'm going to walk you through that process. If you have a production account you want to do this in, you can request through your Salesforce rep that a sandbox be configured with access to Salesforce Lightning Connect, so you can test this out against your primary production org.

So I'm in the Setup tab — many of you who are familiar with Salesforce may already know this. We click on Setup, I scroll down to Develop under the Build section, expand that, and go to External Data Sources. As Jason mentioned earlier, an external data source is very similar to a custom object or any of the native objects within Salesforce. I'm going to click on New External Data Source and give it some configuration — again, no code; this is all point-and-click, web-interface-type work so far. Let's call this, say, Sales Orders. I select Lightning Connect as the type of external data, and I paste in the URL that was generated — again, at datadirectcloud.com; that's the OData endpoint URL we provide. Under Format, I'm going to switch to JSON — we support either one, but we've found that JSON seems to give slightly better performance, so that's our recommendation. Then under Authentication I select Named Principal. Named-principal authentication means that within this Salesforce environment I specify one username and password for bringing the external data into Salesforce, and then any of my Salesforce users will have access to that data unless constrained by Salesforce's permission system. There's also a Per User option, which would mean each individual user has to provide their own data source username and password. So under Named Principal, I switch to password authentication and enter my DataDirect Cloud username and password — not the database username and password, the DataDirect Cloud information. This gives me access to any of the data I've exposed via OData from DataDirect Cloud, regardless of the underlying data source. I go ahead and hit Save.

I have now configured the external data source, and what I need to do next is actually connect to it. Clicking the Validate and Sync button tells Salesforce: go connect to the URL defined below and see what's out there. What it's done is come back — very much like that Postman test I showed you earlier — and I see the two entities, and I'll go ahead and check them.

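Once entities are synced in as external objects, they can be queried like other Salesforce objects. External object API names carry an `__x` suffix (and custom fields a `__c` suffix), so a SOQL query can be issued against them through the standard Salesforce REST query resource. A sketch — the object and field names and the instance URL are illustrative, not from this demo org, and a real call also needs an OAuth bearer token:

```python
from urllib.parse import quote

def soql_query_url(instance_url, soql, api_version="32.0"):
    """Build the Salesforce REST API query URL for a SOQL statement.
    The caller still supplies an OAuth token in the request headers."""
    return (instance_url.rstrip("/")
            + "/services/data/v" + api_version
            + "/query?q=" + quote(soql))

# External objects use the __x suffix; names here are illustrative.
soql = "SELECT AccountNumber__c FROM Customers__x LIMIT 5"
url = soql_query_url("https://na1.salesforce.com", soql)
```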
I can select both entities — or objects, or tables, depending on the lingo you want to use — and I'll click on Sync. Salesforce is now going out and reading the schema from DataDirect Cloud — the schema DataDirect Cloud built by reading the SQL Server database. If I scroll down, you'll see there are now two external objects related to this external data source. If I click on one of them — say, Sales Order Headers — and scroll down a bit, you will see all of the fields that were accessible and visible in my graphical SQL editor within DataDirect Cloud, the same as within the database. All the way through — from the database into Salesforce — we have maintained information not just about the objects themselves, but about all of the fields within each object, or columns within each table, and we've maintained the data types. So the information in that database is now available within my Salesforce environment, and I can do whatever I want within Salesforce to use it. And whenever a record within this external object is accessed, it happens in real time: pulling up a sales order record within Salesforce generates a query, Salesforce sends it via OData to DataDirect Cloud, and we take that OData request and translate it into the appropriate SQL statement to access the data in Microsoft SQL Server.

So that is the end-to-end demonstration showing the ability to pull data from DataDirect Cloud, through OData, into Salesforce Lightning Connect. I've been going on here for about 20 minutes, so you can see that in under 20 minutes we were able to provide all of that connectivity, all with just point and click. If you already know the database you're connecting to — back in DataDirect Cloud, looking at the data source again — the information you want to bring in, and your username and password for the database, then today, right now, you can go to datadirectcloud.com, sign up for a free 30-day evaluation, and actually test this out. It's available today and has been for quite a while now.

At this point, I'm going to take a breath for a moment and hand things over to my colleague Greg Stasko, to give him an opportunity to look at any questions that have come in so far. A reminder: somewhere on the right-hand side of your screen there should be a GoToWebinar panel that will allow you to ask any questions you have, and we will read through those and answer as many as we can. Again, my name is Matthew Monahan — it's been a pleasure talking with all of you today. Jason, I really appreciate you taking the time, and of course if anyone has questions for Jason, we can point those to Jason as well and take the next 15 or 20 minutes to go through any questions you folks have. Greg?

Thank you, Matthew and Jason. We do have some good questions that have come in. The first one: are there any special database permissions or privileges that I have to have to OData-enable my data source?

Great question. The only permissions you need at the database level are ones that allow you to read and access the data you want to bring into DataDirect Cloud — and through DataDirect Cloud into Lightning Connect. There are a couple of ways to do this. One: if it's a, I'll say, publicly accessible data source — meaning maybe it's out on the web, or it's available through your firewall today — you just need to put those credentials into DataDirect Cloud and then you can access the data. If it is behind a firewall, you will need to install the on-premises connector. I'm just going to jump back to the demo briefly and show you that under the Quick Links on the left-hand side there is a Downloads link. You click on that, and at the bottom right you will see the DataDirect Cloud On-Premise Connector installer — there is a 32-bit and a 64-bit version available for Windows today — and you would download either

of those as appropriate if you want to do a basic test and you’re running a Windows machine on your corporate network that has access to the database you could just install that on your desktop computer or laptop and and give it a test that way if you are if you don’t have access that from your desktop if it’s in a segmented part of the network then the on-premise connector would need to be installed in that Network segment so it has access to the database the only thing you need for that installer is your data dark cloud user name and password so of course if you get to this download page you’ve already got your your data direct cloud username and password because it sits behind the login so that information is configured when you install the on-premise connector the on-premise connector is designed to be very secure not only in the data that it transmits but in the way it works so you must put in a username and password when you install your on-premise connector and that one can only be used with your account so there’s there’s no way for any other account within dad direct cloud ever to access your on-premise connector so hopefully that answered the question yes thank you another question just come in are there limits to how frequently i can connect into the volume of data that i retreat so there are and i’m just going to talk for a moment but at jason feel free to jump in I I believe that for for lightning connect there are some limitations on the volume of calls that are separate from the other api’s so it does not fit into the overall API bucket that you have today there are some limitations around you know / 0 data endpoint I think it’s up to 100 objects or tables Jason out if there’s anything else you want to add there that I’m missing Thank You Matthew well so you have to rewrite for the access to the external object it is the data is being stored our selves Salesforce so it’s maintaining a different set of limits or quarters different than the api’s so 
Currently we have a limit, more or less what you could call a service protection, of about 50,000 calls per hour, meaning that on an hourly basis you can make up to 50,000 outbound OData queries against the external data system. However, as I mentioned, this is more or less a soft limit for service protection, which means that if you need to go above it, all you need to do is submit a request to your Salesforce rep and we can raise the limit to a higher ceiling. So that's the rate limit. As Matthew was saying, there is also a limit on how many external objects you can have per Salesforce org. You can link external objects to database tables, and today we support up to 100 external objects per org, meaning that once you establish connectivity to an external database you can have up to 100 external objects in that particular org mapped to database tables. We are planning on the Salesforce side to increase that limit; it is currently a technical limitation, but we plan to lift it in future releases, which hopefully will give you a lot more than just 100.

Thank you, Jason. On the DataDirect Cloud side we also currently limit based on the volume of data that passes through, so depending on the particular plan type there are a number of different limits on how much data can be moved through DataDirect Cloud before you become open to data overage charges.

All right, thank you very much. Another question has just come in: can we update external object records through Salesforce and write back to the on-premise data source?

Great question. Today we do not have write support available on either the Salesforce side or the DataDirect Cloud side. I should say that on the Salesforce side, write capability is currently in a limited preview, so if you request access through your Salesforce rep, that is something that can be enabled. Salesforce is
working on releasing that into GA later this year. For DataDirect Cloud, we are also working on write support and plan to deliver it as soon as Salesforce goes GA with that capability.

Thank you for that. Another question has come in, specific to the versions of databases that are supported. The question asks: if

my Oracle version is not current, am I still able to enable it with OData? That is a great question, and we go to great lengths to make sure that prior versions of the databases we support are covered to the greatest extent possible. I don't have that list in front of me, but we can certainly provide a list of the specific database versions that are accessible via DataDirect Cloud and our connectivity.

Excellent. Are you able to use external object data to trigger workflows on the Salesforce side? I'm going to let Jason handle that one, because I'm actually not sure.

Sure, that's actually a good question. The biggest difference between an external object and a standard or custom Salesforce object is that the data is stored outside of Salesforce, which also means that when the data changes in your master system outside of Salesforce, there isn't a way for us to find out that it changed. That presents a challenge for implementing triggers and workflows. However, once we also support the OData 4.0 standard, which has a callback hook that lets the external system notify us, we will be able to implement triggers and workflows. So, long story short: today, when we only have the OData 2.0 connector, workflows and triggers are not supported for Lightning Connect external objects, but we are currently working on the OData 4.0 connector and plan to have it ready by the end of the year; once we have that, we will be able to support triggers and workflows.

Thank you, Jason. On a related question, following up on that: what is the ability, in terms of external data, to use it with either Salesforce reporting or Salesforce Wave Analytics?

So far, reports fall into the same category. By its nature, a report's data is sitting outside Salesforce, so pulling a small quantity of
data is easy, but when you want to, say, do a full table scan, or join internal objects and external objects together, that may require pulling an excessive amount of data. So we are actively working on a local caching approach that will allow us to support reports, including custom reports, on external objects. This functionality belongs to the same bucket of work we are trying to complete by the end of the year, so you can probably expect triggers, workflows, and reports all to be available by the end of the year. The second half of the question relates to Wave. As some of you know, Salesforce launched our Wave analytics cloud late last year, and Lightning Connect has actually been used in conjunction with it: some of our customers have been using Lightning Connect to pump data from external databases into Wave, using it purely as, think of it as, connective tissue that lets them access the external system and in turn push the data into the Wave analytics cloud. In the same vein, other customers take a similar approach: you can think of creating a sort of ad hoc ETL solution, using Lightning Connect to pull data from an external system, whether SAP or a Microsoft system, and then using a piece of Apex code to replicate the data from the external objects into a custom object. We have seen use cases like that as well. These belong more to the category of customers using Lightning Connect as a connector to push data from external systems into Salesforce, which is a little different from the real-time access use cases we typically present to customers.
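The ad hoc ETL pattern just described, Apex code copying rows read through external objects into custom objects, boils down to an upsert keyed on an external ID. A rough Python sketch of the idea (the field name `ExternalId` and function name are illustrative, not Salesforce APIs):

```python
def replicate(external_rows, local_store):
    """Upsert rows read through an external object into a local store,
    keyed on each row's external id -- the essence of the copy pattern."""
    for row in external_rows:
        # Insert a new record, or overwrite the stale copy of an old one.
        local_store[row["ExternalId"]] = dict(row)
    return local_store
```

In a real org the local store would be a custom object and the writes would be DML upserts inside Apex; the sketch only shows the keying logic that keeps repeated runs idempotent.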
Excellent, Jason, thank you. We have another question that has come in: can I use DataDirect Cloud and Lightning Connect to link my spreadsheets on Google Drive or OneDrive and bring that data into Salesforce? Google Drive or OneDrive: we have seen a few requests for data sources that are outside what we would consider our traditional data source model, those document repositories, and

today we don't have that capability. It's something we would consider putting on the roadmap if there were significant interest in it, and I'd be very curious, offline, to learn a little more about the specific use cases around that and the type of documents or data folks want to bring in that way.

All right, thank you, Matthew. Next question: can we bring data from multiple sources, for example SQL Server and Oracle, into a single object in Salesforce? Can that object be populated with data from more than one external source?

That is currently something we are working on, and we expect to release it in the very near future, I believe with our next release: the capability to take, within DataDirect Cloud, more than one of your configured data sources, and they don't have to be the same kind, one could be Oracle, one could be SQL Server, one could be Google Analytics, pull those into a group, and then, instead of exposing the individual data sources, expose the group through a single OData endpoint and present that to Salesforce. There are probably a few different use cases where that makes sense, and it is something we expect to release very soon.

Matt, can I add to that? That is a capability that will not be supported out of the box by Salesforce, and it is one of the examples where DataDirect Cloud can add value: a customer will be able to construct composite objects from different data sources and present them as a single virtual OData feed to Salesforce. Because on the Salesforce side we aren't able to construct composite objects from different data sources, we rely on middleware solutions such as DataDirect Cloud to fill that gap. So that's another example of certain use cases, certain areas, that are much easier to handle with DataDirect.

Exactly, thank you.
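The composite-object capability described here, rows from different backends presented as one virtual OData feed, is essentially a join performed in the middleware. A hypothetical Python sketch under stated assumptions (the two source names and the `CustomerId` join key are made up for illustration):

```python
def compose(oracle_rows, sqlserver_rows, key="CustomerId"):
    """Merge rows from two backends on a shared key so the group can
    be exposed as a single virtual object over one OData endpoint."""
    merged = {}
    for row in oracle_rows + sqlserver_rows:
        # Rows sharing the key collapse into one composite record;
        # later sources contribute any fields the first lacked.
        merged.setdefault(row[key], {}).update(row)
    return [merged[k] for k in sorted(merged)]
```

A production federation layer would of course push filters down to each backend rather than materialize everything in memory; the sketch only shows the key-based composition.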
The next question asks: can external data be integrated with facilities other than the Sales Cloud? Can I use it from portals, or from the Service Cloud, things like that?

I think that one is more on the Salesforce side, and the quick answer is yes. An external object is just like a custom object, to be honest, so pretty much whatever you can do with custom objects, you can do with external objects. You can use it in the Sales Cloud, you can see it in the Service Cloud, you can locate it in the consoles, you can use it with Visualforce pages, and you can construct queries against it with our query and search languages. The only subtle difference is that for external objects, none of the data is stored in Salesforce databases, so every time an external object is accessed we need to make a web services call over the internet to your OData endpoint to fetch the data into Salesforce. The data is always fresh, meaning you will never be looking at stale or outdated data, but at the same time performance is a huge consideration, because at the end of the day you need to go over the internet to retrieve the data from the data source. That makes it very important that you choose middleware that will give you good performance and also reliability.

Thanks, Jason. I think we're reaching our last question: can you describe any limitations of the trials that have been mentioned, the DataDirect Cloud trial and the sandbox use of Lightning Connect on the Salesforce side? I know there are limitations in terms of what a user can do with those.

Yeah, I think there are probably a few. In both cases they are designed to be development, test, and evaluation environments; they are not meant to be production environments. I think that's the first, and maybe obvious, thing to state. In terms of DataDirect Cloud, we do not have a lot of limitations within
the 30-day free trial; in other words, you can set up multiple data sources and configure any of our supported data source types. There will be some bandwidth limitations, just as there are with production accounts, so those are some of the things that will be slightly different within DataDirect Cloud. On the sandbox side, at least in terms of

testing Lightning Connect myself, I haven't come across any specific limitations there. Jason, I don't know if there is anything you want to add about sandbox limitations on the Salesforce side.

Not so much a limitation: Lightning Connect is a GA product, and as Matthew mentioned earlier, it is available in our Developer Edition organizations, meaning that today, if you go request a DE org on the Salesforce side, the feature is already enabled by default in that org. From a functionality perspective, all the features are there. The only difference, as Matthew was saying, is that Developer Editions are meant for development purposes, so we limit the number of transactions, or the capacity, you can drive out of the org; other than that, functionally it is exactly the same as what you will see in Enterprise and above editions. From a sandbox perspective, likewise, there is no difference: a sandbox is also enabled with Lightning Connect, which is the full product. I think the only difference is that there can be a trial period, it could be 60 days, during which you can try the features until the trial license expires. That's pretty much the only difference between a sandbox trial and a full production org.

Thank you, Jason, and thank you, Greg, for passing those questions along. I think we're just about winding up the hour here, so again, let me extend my gratitude and my thanks to Jason Choi for joining us today to talk about Lightning Connect, and Greg, thank you for going through those questions with us. I'm going to turn it back over to Marlys and wish you all a very nice rest of your day.

All right, well, thank you, everybody. Today's session was recorded and will be available through an email that will be sent out to everyone who registered. Again, thank you for joining us, and we look forward to seeing you at our next webinar.