This is Part 2 of How To Deploy Deep Learning for Enterprise, a version of the talk our co-founder Stephen Piron recently gave at NVIDIA’s GTC. Find Part 1 of our series on deploying deep learning for enterprise here.
After exploring the challenges enterprises face when introducing deep learning models, let’s map out some of the positive outcomes. To illustrate, we’ll revisit the story of how we collaborated with one of the world’s largest banks to deploy the first deep learning model for retail banking. After reading, we hope you walk away with a better understanding of what initial successes look like for deep learning deployment in enterprise.
We’ll also look at how working with the bank led us to articulate what we think is the key hurdle to successful adoption today — despite (or perhaps because of!) all the information available, most people don’t have a firm grasp of what deep learning or AI really is. Pinpointing this hurdle led us to establish our mission and pursue it by developing our deep learning platform, Frontiers. With our mission underway, we’re excited to continue empowering people to understand how deep learning works and the impact it has on business and the world at large.
DeepLearni.ng’s deployment of deep learning at the bank helped the enterprise emerge with an elevated understanding of how to approach their data to optimize the deployment of deep learning projects. Working closely together on the first use case enabled our team to transfer knowledge about preparing data to meet deep learning’s specific requirements.
The bank’s teams have run with this knowledge, using it to begin restructuring the way data is collected, stored and managed. With more projects on the horizon, the bank will use what we learned together to tailor incoming data for deep learning readiness.
Seeing deep learning in action also helps teams understand the technology’s key advantage: finally securing an ROI on data consolidation. Because successful deployment depends directly on the amount of context-rich data available, witnessing the first use case in action proves the value of collecting as much data as possible. At the bank, our prediction model incorporated — and gave insights on — hundreds more features than the conventional model. Introducing deep learning into the enterprise sparks a new way of thinking about and structuring organizational data, strengthening the organization’s ability to match the complexity of present and future challenges.
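To make the “more features” point concrete, here is a minimal, purely illustrative sketch — not the bank’s actual model, which was a deep network trained on proprietary data. It trains a toy logistic-regression classifier in plain Python on a wide synthetic feature vector where only a couple of the features carry signal; every name and number below is hypothetical.

```python
import math
import random

def train_logistic(X, y, lr=0.1, epochs=200):
    """Minimal logistic-regression trainer (batch gradient descent).

    X: list of feature vectors; y: list of 0/1 labels.
    Returns the learned weights, with the bias term stored last.
    """
    n_features = len(X[0])
    w = [0.0] * (n_features + 1)  # feature weights plus a bias term
    for _ in range(epochs):
        grad = [0.0] * (n_features + 1)
        for xi, yi in zip(X, y):
            # zip truncates at len(xi), so w[-1] (the bias) is added separately.
            z = sum(wj * xj for wj, xj in zip(w, xi)) + w[-1]
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid
            err = p - yi
            for j, xj in enumerate(xi):
                grad[j] += err * xj
            grad[-1] += err
        for j in range(len(w)):
            w[j] -= lr * grad[j] / len(X)  # averaged gradient step
    return w

def predict(w, xi):
    """Classify by the sign of the linear score (equivalent to p > 0.5)."""
    z = sum(wj * xj for wj, xj in zip(w, xi)) + w[-1]
    return 1 if z > 0 else 0

# Synthetic "customer" records: 20 features, but the label depends
# on only two of them -- the model has to find the signal itself.
random.seed(0)
X = [[random.gauss(0, 1) for _ in range(20)] for _ in range(200)]
y = [1 if x[0] + x[3] > 0 else 0 for x in X]

w = train_logistic(X, y)
accuracy = sum(predict(w, xi) == yi for xi, yi in zip(X, y)) / len(X)
```

The sketch only shows that a wide, context-rich input can be fed to a single trainable model rather than hand-picked down to a handful of variables; the bank’s production model was far deeper and its features far richer.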
Deep learning’s unparalleled business value becomes more apparent each day. As we’ve explored, the technology’s results can be benchmarked and measured rapidly, offering organizations novel opportunities to transform data into actionable insights. But there are also caveats. The complexity of large organizations makes experimenting with new technology extremely difficult to coordinate. Before our work with the bank, its infrastructure was not sufficient for deep learning projects. By providing a cloud GPU system and software tailored to the bank’s needs, we greatly enhanced their team’s ability to experiment with the technology without compromising rigorous demands for security and compliance.
Many enterprise leaders and employees today are also understandably concerned about deep learning’s potential to negatively impact the job market. This was certainly the case at the bank. While such caution about deep learning’s effect on working conditions is important, there’s a less frequently discussed side of the story worth celebrating: deep learning’s potential to free up human resources through automation, allowing employees to work on higher-level projects that drive value for both individual and organizational growth.
We have already seen this side of deep learning’s impact at the bank. By providing knowledge of and tools for machine learning, we have helped these teams free up time and resources through the automation of tasks usually performed by humans. Employees have already had opportunities to work more intensively on projects that leverage human abilities machines don’t possess, and teams can collaborate on projects that make the most of each individual’s unique strengths.
We have also helped the bank develop strategies to hire data scientists and engineers with the right skill sets to follow the framework we’ve left behind post-deployment. Since our collaboration, the bank has taken on multiple new employees with these skills to bolster their execution of future deep learning projects.
Continuing to collaborate with our team and also beginning to work independently, the bank is now developing a pipeline of projects that will make use of each of our design patterns: Clustering, Reinforcement and Prediction.
Clustering has a wealth of possible applications at the bank, offering an unprecedented opportunity to understand what drives customer loyalty. Our Reinforcement pattern has been explored as a model for monitoring changes to account activity in real time. The Prediction pattern the bank first deployed continues to deliver sustained impact, with many other applications of its framework now under investigation. A great strength of each machine learning pattern is its reusability, and the bank is now taking the foundational parts of the initial model and repurposing them for new projects. Alongside the bank’s internal exploration and deployment of deep learning models, we’re continuing to work with them to make these future applications possible.
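As a hedged illustration of what the Clustering pattern means in practice — emphatically not the bank’s implementation, and with entirely made-up features — the sketch below runs a minimal k-means in plain Python to segment hypothetical customers into groups by transaction frequency and account balance:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: returns (centroids, cluster assignments)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from random data points
    assignments = [0] * len(points)
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared distance).
        for i, p in enumerate(points):
            assignments[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
        # Recompute each centroid as the mean of its assigned points.
        for c in range(k):
            members = [p for i, p in enumerate(points) if assignments[i] == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return centroids, assignments

# Hypothetical customer features: [monthly transactions, avg balance (scaled)].
# Two synthetic populations: highly active customers and dormant ones.
rng = random.Random(1)
frequent = [[rng.gauss(30, 3), rng.gauss(5, 1)] for _ in range(50)]
dormant = [[rng.gauss(2, 1), rng.gauss(1, 0.5)] for _ in range(50)]
centroids, labels = kmeans(frequent + dormant, k=2)
```

A real deployment would cluster on far richer learned representations of customer behavior, but the mechanic is the same: unlabeled records fall into segments the organization can then interpret — here, loyal versus dormant customers.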
We have also encouraged employees at the bank to think realistically about deep learning’s current and short-term capabilities. Acquiring a firm grasp on what deep learning can and cannot do is critical when developing a pipeline of projects, and employees at the bank have used knowledge they developed during our initial training sessions to inform planning for future projects. Following our initial period of collaboration, teams have started to see how smaller and more targeted use cases fit into the bigger picture of developing a deep learning pipeline.
After working with the bank, we emerged with a better understanding of an important and solvable problem: most people don’t know what machine learning is. But that’s not really people’s fault. Given the profound amount of hype and misinformation swirling around our field, it takes real effort to sort valuable information from noise. On top of that, without technical expertise, it can be quite a struggle to grasp how the technology works.
We took this discovery and transformed it into the mission for our platform software, Frontiers. Currently, our team of machine learning experts uses Frontiers to achieve two primary objectives: first, to execute all steps of deep learning deployment, from data wrangling to monitoring results; and second, to show rather than tell our enterprise clients how the technology works. As we continue to iterate on Frontiers, we’re striving to provide keys to some very big and important questions, including: What is machine learning? And why is it about to transform our world? For machine learning’s impact to be easily understood by everyone, we know we will have to dedicate ourselves to developing people’s understanding of the technology’s current capabilities and limitations.
With all of this work underway, there’s a huge positive impact for us as well. To sum up with a line often attributed to Einstein: “If you can’t explain it simply, you don’t understand it well enough.” By explaining machine learning to non-experts (and it’s definitely a challenge!), our technical team has had constant opportunities to improve their own understanding of the technology and its value.
With our mission set and a measurable impact already underway, Frontiers will become a platform ultimately designed to help non-technical users navigate machine learning and cultivate its revolutionary value.