AiThority Interview with Fernando Lucini, Global Data Science & ML Engineering Lead at Accenture Applied Intelligence
Hi, please tell us about your role and the team/technology you handle at Accenture?
As the Global Data Science & Machine Learning Engineering Lead for Accenture Applied Intelligence, I’m focused on helping clients scale AI, and creating solutions to meet our clients’ growing demand for data-led business outcomes. Our efforts go beyond Applied Intelligence practitioners and Accenture technologists, as we believe everyone at Accenture is responsible for bringing industrialized AI to life for clients.
Within Applied Intelligence, we’re finding new ways to reinvent what it means to be data-driven by scaling AI, analytics and automation — and the data that fuels it all — to power every single process and every single person across an organization. More specific to my team, we work with clients to develop and implement AI strategies and technologies that help their people and their organizations get more value from data, make better-informed and more timely decisions, and maximize the investment they’ve made in the technologies that power AI, like data infrastructure and cloud.
How have AI and ML models evolved in the last 2-3 years since the arrival of automation and IT virtualization capabilities?
We know from our research, and what we hear from clients, that 3 out of 4 organizations struggle with how to glean business value and truly scale AI. Even though the technology has quickly advanced, organizations are still working to keep pace. There’s been a move toward increased automation and reusability in how models are created and maintained, including feature stores, model management, model lifecycle tooling and hyperparameter optimization, an effort that strives to raise the reusability, transparency, repeatability and consistency of data science organizations.
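The repeatability and transparency goals above can be illustrated with a minimal hyperparameter search in plain Python. This is only a sketch: the toy "model" (a weighted threshold rule), the dataset and the grid are invented for illustration, not drawn from the interview; real pipelines would use a proper ML library and persist trial results to a model registry or experiment tracker.

```python
from itertools import product

# Toy labelled dataset: (feature, label) pairs -- purely illustrative.
DATA = [(0.1, 0), (0.4, 0), (0.35, 0), (0.6, 1), (0.8, 1), (0.9, 1)]

def accuracy(threshold, weight, data):
    """Score a trivial 'model' that predicts 1 when weight * x > threshold."""
    correct = sum(1 for x, y in data if (weight * x > threshold) == bool(y))
    return correct / len(data)

def grid_search(data, thresholds, weights):
    """Exhaustively evaluate every hyperparameter combination.

    Recording every trial, not just the winner, is what gives the search
    the repeatability and transparency the text describes.
    """
    trials = []
    for t, w in product(thresholds, weights):
        trials.append({"threshold": t, "weight": w, "score": accuracy(t, w, data)})
    best = max(trials, key=lambda r: r["score"])
    return best, trials

best, trials = grid_search(DATA, thresholds=[0.3, 0.5, 0.7], weights=[0.5, 1.0])
```

Because every trial is logged with its full configuration, another data scientist can rerun, audit or extend the search, which is the practical meaning of reusability here.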
What are the biggest barriers in the adoption of AI ML techniques for a modern data-driven company?
We have seen organizations committed to a sort of perpetual proof-of-concept mindset. They run a number of successful POCs but haven’t aligned their AI strategy to their business strategy, and so are unable to turn these POCs into value-delivering initiatives integrated into the business’s systems and processes. You have to pilot with a plan to scale into your business. That means ensuring you have the right data foundation in place to power your AI, integrating AI into the business systems, building the right level of data literacy and a data-driven culture in your organization, and having all of this ladder up in support of a particular business challenge you are solving. Don’t just implement AI for the sake of experimenting; have a plan to succeed and drive provable business value.
What will Data Science look like in the next 2-3 years?
For today’s companies, the most formidable challenge in the next three years will be mastering the complexities of running an AI ecosystem with numerous, interconnected AI models that each have their own quickly changing, complex dependencies. This ecosystem will live and breathe across a number of cloud providers, creating more opportunities for hybrid cloud environments.
Secondly, having all of this in place means, alongside optimization and scale, that our clients will have the tools to industrialize AI successfully, allowing for fast implementations and facilitating the machine learning development lifecycle for many companies. That said, this will not replace the human skills needed to perform error analysis, improve the performance of a given model or train groups of interacting models. Automation will accelerate the ways we develop machine learning systems, but it won’t guarantee the desired model quality; the latter will fall into the hands of data scientists, an important reminder that human and machine must work together to achieve successful AI and ML integrations.
What kind of programming skills should data scientists / big data analysts have in order to gain a greater control of their models?
Being an expert in data science requires a variety of skills. Data scientists need deep mathematical knowledge in order to understand and master the machine learning models and methods available and apply them effectively. They must also possess strong computer science skills, to master how models are engineered into scalable solutions within companies. Together, these two areas of knowledge and skill require a strong academic background, time and exposure to industry problems. While machine learning applied to industry will become more common, deep expertise in machine learning applied to a specific industry will remain limited and very valuable. Firms will look to consultancies and other companies to help plug the skills gap, and data scientists will need to constantly upskill and reinvent their core skills to stay current and relevant in both their core competency and their industry.
Tell us more about NLP and how businesses are lining up to build NLP capabilities into their software solutions?
Natural Language Processing (NLP) is an exciting area undergoing fast development and growth in implementation. Recent advancements have opened the door for a wider variety of businesses to utilize techniques and technologies that exploit text. With advances in transfer learning that limit the need for very large datasets, the barrier to entry for most use cases on unstructured data is likely to become lower. As a result, innovative and leading companies will successfully use the latest natural language models to fully automate offerings including customer care, alongside many other areas where an advanced understanding of human language might have previously been an issue. We are living through a renaissance of NLP; in the last 2-3 years we have seen incredible breakthroughs that will make more use cases achievable in a difficult field — the field of how to use information that is high in context.
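The transfer-learning idea mentioned above — reusing a large pretrained model so that only a small task-specific layer needs training on a small dataset — can be sketched in miniature in plain Python. Everything here is a hypothetical stand-in: the `pretrained_features` function imitates a frozen pretrained encoder with a hand-built keyword featurizer, and the two-example training set exists only to show that little labelled data is needed once features are reused.

```python
import math

def pretrained_features(text):
    """Stand-in for a frozen pretrained encoder: maps text to fixed features.

    (Hypothetical featurizer -- a real system would use a learned language
    model whose weights are reused rather than retrained.)
    """
    words = text.lower().split()
    return [
        sum(w in {"great", "love", "good"} for w in words),  # positive cues
        sum(w in {"bad", "awful", "hate"} for w in words),   # negative cues
    ]

def train_head(examples, epochs=100, lr=0.1):
    """Train only a small logistic 'head' on the frozen features."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for text, label in examples:
            f = pretrained_features(text)
            z = w[0] * f[0] + w[1] * f[1] + b
            p = 1.0 / (1.0 + math.exp(-z))       # sigmoid
            err = p - label                       # gradient of log-loss
            w = [wi - lr * err * fi for wi, fi in zip(w, f)]
            b -= lr * err
    return w, b

def predict(w, b, text):
    f = pretrained_features(text)
    return 1 if w[0] * f[0] + w[1] * f[1] + b > 0 else 0

# A tiny labelled set suffices because the features are reused, not learned.
train = [("great product love it", 1), ("awful bad experience", 0)]
w, b = train_head(train)
```

The design point is the split: the expensive representation (the encoder) is built once and shared, while each new use case trains only the cheap head, which is why transfer learning lowers the barrier to entry for text use cases.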
Your prediction on the future of No code?
We are already seeing a push in the software industry to take elements of machine learning (modeling, data preparation, testing, benchmarking and so on) and make them more available to lower-skilled and/or broader teams at a lower price point (skills-wise) through the use of no-code tools. My prediction is that this will grow and continue to evolve to capture a growing number of conditions, scenarios and methods. It’s a positive evolution to adapt to users’ needs, and it does not remove the need for deep expertise.
This should not be confused with AutoML and other techniques that automatically tune methods and algorithms to a data set and objective. These might also have a strong no-code presentation, but they deserve to be considered separately.
Your prediction on the “future of cloud computing strategies”
The enterprise of the future thrives on cloud-enabled, data- and AI-driven models with significant implications across the entire C-suite. And we’re at the forefront of this business need. In fact, Accenture recently announced that we’re investing $3 billion over three years to help clients across all industries rapidly become “cloud-first” businesses and to accelerate the growing need for digital transformation. It’s clear the acceleration to the cloud we’re seeing across industries will allow businesses to scale their AI faster, at a time when 84% of executives are telling us they won’t achieve their growth objectives without scaling AI across their organization. We like to think of cloud as our AI engine: the technology is only as good as the data and computing power that fuel it. Modern cloud computing strategies require innovative software architecture to create the patterns and standardization needed to integrate AI components quickly – and at scale – across all of the cloud providers and on-premise. As the demands of data storage and computing grow, adopting cloud is the only way companies will be able to run successful models and businesses.
Thank you, Fernando! That was fun and we hope to see you back on AiThority.com soon.
Fernando Lucini is a Managing Director leading Artificial Intelligence for Accenture UKI. Fernando is a passionate and experienced senior leader with extensive experience in AI and ML technologies for text, speech and video, as well as in opening markets, running large technology and sales teams, strategic acquisitions and integrations, and the running of large software and solutions companies. Previously, Fernando spent 20+ years in the enterprise software industry, latterly as Chief Architect and Chief Technology Officer of Hewlett Packard Enterprise’s Big Data and AI Software business, creating technologies to automate and understand text, speech and video data and integrating these technologies into business solutions for Fortune 100 companies across industries including Financial Services, High Tech, Pharma and many others.
Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Interactive, Technology and Operations services — all powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. Our 537,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities.