Over the last year I have spent a significant amount of time with customers and partners discussing Artificial Intelligence (AI), the core patterns emerging from initial implementations, and, in many cases, where and how to get started on the journey. While there has been, and continues to be, a lot of excitement around the potential of AI, people are starting to move from initial, bespoke, research-oriented implementations toward a more considered and pragmatic approach. In this area we’ve done a lot of thinking about not only the patterns for AI, but the journey itself.
This notion of real-world or pragmatic AI stems from conversations with many customers and partners about their efforts over the last year. These projects typically align well with the patterns we’ve discussed and were undertaken both to understand what is possible with AI and to learn more about the tools and platforms available. In many instances a team would take a set of data offline, build a model, tool, or service, test it out, perhaps even put it into production as a proof of concept (PoC), and drive some learning. Unfortunately, many of these projects did not feed into the core data estate or pipeline, did not generate enough ROI to justify scaling the work further, and did not create a sustainable differentiation for the company. So, while they were great for learning, which is important in itself, they were not great for creating a sustainable AI asset. With that learning done, the question becomes how best to think about the where and how of creating a sustainable AI asset and capability that supports the differentiation and growth of an organization.
In this area we’ve created a framework for thinking about the AI journey that I recently tested with customers in Europe and here in Redmond. The framework seems to resonate well with the people we’ve spoken with, so I wanted to share it more broadly and hear what others think. Note that the journey is not necessarily linear; it’s more about picking the right tools and entry point(s) for any given organization and scenario, and thinking about what makes the most sense from an investment and opportunity-cost perspective. As with any new technology, the question is not how to use it in every scenario all at once, but rather where it can help a company grow and differentiate, and where you want to build core capability versus leverage the services or tools of others. With that in mind, here are the four areas we have been discussing with customers and partners.
BI before AI
Data is the foundation for AI; without data there is no AI. So the first opportunity is to leverage the data already within an organization. I like to look for “Insight” before trying to drive “Intelligence,” hence the notion of “BI before AI.” Where does a company have unique data that can be leveraged? Where are there data sources that can augment that data? Is the company data driven? Does it make decisions based on data? If not, that is a great place to start, as data will be the cornerstone of much of a company’s AI work going forward.
With the hype around AI being so strong, many people want to rush into AI before making sure their data estate is well structured. This step is critical in both obvious and subtle ways. Data is used to train models, so you obviously need enough data to train accurate models without overfitting. More subtly, it is important to have a diversified dataset to minimize the risk of bias. One of my colleagues, Judson Althoff, often says, “…without good data, all you will do with that fancy AI technology is make mistakes with greater confidence than ever before.” For larger organizations with a lot of data, or a long history of data collection, there will be work to do to determine which data to clean, normalize, and use, and where to start collecting data from scratch, in a way that best serves the insights and ultimately the intelligence they want to drive. Having a plan for the data estate, and a path that balances driving insights with building out a sustainable data asset, is a great entry point on the AI journey.
Along with cleaning and normalizing the data, getting it into a standard format (e.g., the Common Data Model [CDM]) enables you to see where best to prioritize your efforts. Normalizing your data with CDM and adopting tools like Power BI makes it possible not only to get insights but also to easily create PowerApps and line-of-business tools.
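As a minimal illustration of this kind of normalization (the field names and common shape below are hypothetical, not the actual CDM entity definitions), consider mapping customer records from two line-of-business systems into one shared schema:

```python
# Hypothetical sketch: normalizing customer records from two systems
# into a single common schema. Field names are illustrative only;
# the real Common Data Model defines standard entities and attributes.

def normalize_crm(record):
    """Map a CRM-style record into the common shape."""
    return {
        "customer_id": record["AccountId"],
        "name": record["AccountName"],
        "email": record["PrimaryEmail"].lower(),
    }

def normalize_billing(record):
    """Map a billing-system record into the same common shape."""
    return {
        "customer_id": record["cust_no"],
        "name": record["cust_name"].title(),
        "email": record["contact_email"].lower(),
    }

crm_rows = [
    {"AccountId": "A-1", "AccountName": "Contoso", "PrimaryEmail": "Info@Contoso.com"},
]
billing_rows = [
    {"cust_no": "A-2", "cust_name": "fabrikam", "contact_email": "hello@fabrikam.com"},
]

# Once every source maps into the same shape, downstream BI (and later AI)
# can work against one unified view instead of per-system formats.
unified = [normalize_crm(r) for r in crm_rows] + [normalize_billing(r) for r in billing_rows]
```

The point is less the mapping code itself than the discipline: once every system emits the same shape, tools like Power BI can sit on top of one consistent dataset.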
SaaS AI Offerings
I have personally observed that when AI is used to support core business processes, it is easier to get organizational alignment and support for continued investment, which is always critical when investing in new areas like AI. Here we are starting to see teams use AI to augment standard business processes; within Microsoft our team incubated three AI solutions (customer service, sales, marketing) in the last year that are now becoming available to others through the Dynamics 365 AI offerings. In many cases, we start by applying AI on top of our BI: that is, automating the understanding of what is inside the data we are collecting. A simple example is the use of AI to look for patterns and signals in new customer service cases. On a daily basis the AI clusters the topics in the most recent time window and compares them with other BI data to identify emerging trends.
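One way to picture that kind of trend detection is to compare topic frequencies in the latest window against a historical baseline. This is a simplified sketch under my own assumptions, not the Dynamics 365 implementation; the window sizes, thresholds, and topic labels here are all hypothetical:

```python
# Simplified sketch of emerging-trend detection on support-case topics.
# Not the Dynamics 365 algorithm -- just a frequency comparison between
# the most recent window and a historical baseline window.
from collections import Counter

def emerging_topics(recent, baseline, min_count=3, ratio=2.0):
    """Return topics whose count in the recent window is at least
    `ratio` times their (smoothed) count in the baseline window."""
    recent_counts = Counter(recent)
    baseline_counts = Counter(baseline)
    flagged = []
    for topic, count in recent_counts.items():
        # +1 smoothing so previously unseen topics don't divide by zero
        if count >= min_count and count >= ratio * (baseline_counts[topic] + 1):
            flagged.append(topic)
    return sorted(flagged)

baseline = ["login error"] * 10 + ["billing question"] * 8   # last month
recent = ["login error"] * 9 + ["sync failure"] * 6          # last day

print(emerging_topics(recent, baseline))  # → ['sync failure']
```

A steady topic like “login error” stays quiet, while a topic spiking from nothing gets surfaced; a production system would add real topic extraction (clustering or classification) in front of this comparison.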
By using AI to assist professionals in common business processes and scenarios, the organization sees real impact from AI and gains proof points to consider for other use cases in the company. The previous examples were role-specific, but you can use AI to get a broader understanding of the organization with Workplace Analytics. In addition, many third-party SaaS services are being created to support horizontal and industry-specific processes. These will become more important over time, as purchasing SaaS services with AI built in, or available as an add-on, will likely become more economical than building your own AI capability for every business area or process within your organization.
Developing with Azure AI
Up to this point we have normalized our data to see where to invest our efforts and leveraged AI in products that make core business processes more effective or efficient. The next opportunity is to use the new AI layer in the development stack to build more custom solutions or services. Here, the ability to use the tools within Azure AI to augment existing processes or solutions can be very powerful, and it is being tested within many organizations today. There are many examples in the Azure AI Gallery to explore what is possible. The capabilities offered through the Cognitive Services for vision, text, speech, and knowledge, the Bot Service, Azure Machine Learning, and more provide a great starting point for infusing AI into your core offerings. In the same way Microsoft uses the Azure AI platform to infuse AI into our own products, other companies can do the same for their products and services. In most cases the core set of Cognitive Services will get you going, and there is the option to customize them when you need something unique to your use case. The advantage of this approach is that you don’t need to be a deep-learning expert or have a lot of training data to get started. Simply call the API you need, or plug into the Azure Bot Service to get up and running quickly. The most common AI apps and services developed with these tools follow the five patterns for AI solutions.
Custom AI Development
Even with all the AI building blocks that make it easy to start using AI in your apps and solutions, there will be times when you want to do something completely custom and build your own models and tools from scratch. In those cases, we offer the data infrastructure, management tools, computational power, and frameworks to accomplish what you need. Starting with the data, we support a variety of data solutions, from cloud-hosted databases to data lakes. Furthermore, if you started with your data in a CDM schema, you can use it in Azure Data Lake Storage Gen2, providing a clean way to scale up your data capabilities as needed. When dealing with big data, the ability to process it quickly is important, and Azure has support for CPUs, GPUs, and FPGAs. For those not familiar with FPGAs, they can be customized to execute AI workloads very quickly, lowering model training and inference time, though they are less flexible than CPUs. Several frameworks are used for deep learning today, and Azure supports many of them. PyTorch, Keras, and TensorFlow are popular ones, and we also support the ONNX standard, which gives your models portability if you need to move between ONNX-supported frameworks. So, the tools for creating your own custom AI solutions are woven through the Azure platform and available for developers and data scientists to take advantage of today.
With the continuing evolution of AI, the opportunity to apply it to real-world problems is here. Since there are multiple entry points, the key is to determine an approach that creates short-term results and builds a long-term asset. Part of achieving that goal is determining where and how to leverage AI and, like all other investment decisions, where to apply your scarce resources versus where to leverage the tools or services of others to drive business outcomes. It’s also critical to think about where you have unique data assets and how to bring those into play as part of your overall AI journey and strategy. As you consider your next AI project, start with the business outcome you are trying to drive; the depth of your AI experience and the level of customization needed will help you determine where best to start. As I often say, “it is too early to do everything with AI but too late to do nothing,” so get started, and we look forward to seeing what you develop.
This post first appeared on MSDN Blogs.