
AiThority Interview with Yuval Fernbach, VP and CTO of MLOps at JFrog

Yuval Fernbach, VP and CTO of MLOps at JFrog, chats about the latest GenAI innovations making waves, tips for optimizing AI and ML workflows, AI deployment best practices, and more in this interview:

———–

Hi Yuval – tell us about yourself and your journey as CTO.

My journey has always been driven by a deep passion for unlocking the potential of data and machine learning (ML) to shape future products. About a decade ago, I began to observe a growing trend of companies adopting machine learning to enhance their products. I saw firsthand how ML was transforming customer experiences in products like Netflix, Uber, and Amazon, particularly through recommendation systems and personalized solutions. At the same time, I realized how challenging it was for companies to build and scale ML solutions because of the complex engineering and infrastructure requirements.

This inspired me to co-found Qwak to simplify the ML process for companies and enable them to focus on their core business rather than the technical hurdles of building and managing infrastructure. After Qwak was acquired by JFrog last June, I became the VP and CTO of MLOps for JFrog. Now, as part of JFrog, I’m excited to help companies integrate ML into their development lifecycle alongside traditional software development.


What latest GenAI innovations have piqued your interest?

For a long time, I have been fascinated by the impact of machine learning and AI, and I have helped companies integrate them into their software development life cycle. The innovation happening in areas like agentic AI, multi-agent systems, and multi-modal models capable of handling video and audio is incredibly exciting. These advancements are making AI more cost-effective, flexible, and accessible. I believe the future of software development will be driven by AI and ML, with the next generation of products and features being built using models and AI.

How do you feel GenAI will reshape the B2B SaaS marketplace in the next few years?

I think we will see a boom in SaaS companies that are built primarily on GenAI technologies, leveraging the capabilities of LLMs and other GenAI systems. Additionally, vertical-specific GenAI solutions will not only impact customer-facing applications but also drive innovation in B2B solutions across industries like cybersecurity and finance, and beyond. Companies will develop specialized GenAI models tailored to the unique needs of their respective sectors. The improved ability of GenAI to handle diverse inputs and generate better outputs will lead to better integration and interoperability between different B2B SaaS products, which will ultimately facilitate more efficient data exchange and collaboration. Finally, as the costs of training and deploying GenAI models decrease, B2B SaaS offerings will become more cost-effective, further accelerating adoption and growth in the sector.

When it comes to optimizing AI and ML workflows for innovators and users: what top tips would you share?

I believe there are a couple of key things companies can do to optimize their AI and ML workflows and drive real business value, while at the same time maintaining the flexibility and security required in today’s landscape.

  • Focus on business value, not infrastructure: Companies should focus on the business value and metrics they want to improve, rather than getting bogged down in the technical infrastructure and engineering challenges.
  • Prioritize adaptability: Given the rapid pace of AI/ML technology advancements today, it is important to have an architecture and workflows that allow you to quickly adapt to new models and solutions as they become available.
  • Develop the right metrics: Defining the right business metrics to measure the success of AI/ML applications is crucial. Don’t fall in love with specific technologies; instead, focus on achieving the desired outcomes.
  • Test and validate use cases: Thoroughly validating the suitability of AI/ML for specific use cases is essential. Experiment with different models and approaches to ensure the technology is a good fit for that use case; one size does not fit all here.
  • Treat ML models as part of the software supply chain: Similar to how companies manage the supply chain for their internal software development, they should apply the same rigor and practices to the ML models that are integrated into their products and services (a minimal sketch of this idea follows the list).
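To make the last point concrete, here is a minimal, tool-agnostic sketch (not a JFrog-specific implementation) of one supply-chain practice applied to a model artifact: record a checksum and basic provenance when the model is produced, and verify the checksum again before deployment. The file names and manifest fields are illustrative assumptions.

```python
# Illustrative sketch: provenance and integrity checks for a model artifact.
# File names and manifest fields are hypothetical, not a specific product's API.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file in streaming fashion."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def record_provenance(model_path: Path, version: str, training_data_ref: str) -> Path:
    """Write a provenance manifest next to the model artifact at build time."""
    manifest = {
        "artifact": model_path.name,
        "version": version,
        "sha256": sha256_of(model_path),
        "training_data_ref": training_data_ref,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    manifest_path = model_path.with_suffix(".manifest.json")
    manifest_path.write_text(json.dumps(manifest, indent=2))
    return manifest_path


def verify_artifact(model_path: Path, manifest_path: Path) -> bool:
    """At deploy time, confirm the artifact still matches its recorded digest."""
    manifest = json.loads(manifest_path.read_text())
    return sha256_of(model_path) == manifest["sha256"]
```

In practice, the same idea is usually delegated to an artifact registry and CI/CD gates rather than hand-rolled scripts; the point is simply that a model binary gets the same versioning, integrity, and traceability treatment as any other software dependency.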


What foundational thoughts should teams keep in mind when deploying new AI tools for their processes/systems?

Adopting GenAI is a complicated process, and it is important that companies have a plan before jumping in. The plan needs to cover the products that will utilize GenAI and the architectures being built. While the open-source community is growing and becoming increasingly cost-effective, it also introduces security risks. It is critical to consider the potential security implications of integrating any AI tool, including protecting against potential attacks, data integrity issues, and the risk of models being tampered with. Ensuring the robustness of AI systems is crucial. Deploying AI tools can also introduce new operational challenges beyond the technical implementation: teams need to think through the impact on workflows, potential downtime, and how to manage the availability of external AI services. Taking a holistic approach to deploying new AI tools will ensure a greater likelihood of success.

Top AI innovators that the industry should watch out for in 2025?

I think some of the biggest things we’ll see this year include the continued reduction in the cost of models and their usage, making AI more accessible for companies. This will be driven by advancements in more cost-effective models, including smaller models from companies like OpenAI. Additionally, multi-modal models—capable of handling text, images, videos, and audio—will become more standard, opening lots of doors and broadening their applicability across industries. Open-source models will gain wider adoption as companies seek greater control over their AI solutions, reducing their reliance on proprietary models. Lastly, concern over AI security will continue to grow, and I think we will see a greater focus on securing GenAI applications.


Yuval Fernbach is the VP and CTO of MLOps at JFrog.

JFrog is a DevOps technology company revolutionizing software delivery with its liquid software approach, enabling continuous and automated updates from build to production, eliminating the need for endless software versions.
