
AiThority Interview with Naren Narendran, Aerospike’s Chief Scientist

Real Time Data Processing With Aerospike

Aerospike

Real-time data processing involves ingesting, processing, and analyzing data as soon as it is generated. This is particularly important in applications where timely insights are crucial, such as financial trading, IoT systems, and monitoring and analytics.

Aerospike excels at real-time data processing thanks to an architecture optimized for speed and low latency. It combines techniques like in-memory data storage, distributed processing, and SSD-based storage to achieve high throughput and low latency. Additionally, Aerospike supports features like strong consistency, automatic sharding, and intelligent data distribution, which are essential for real-time data processing applications. Let's hear from Naren about what makes this platform stand out.
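As a rough illustration (a toy model, not Aerospike's actual client API), the access pattern behind such real-time workloads — low-latency reads and writes keyed by namespace, set, and key, with per-record expiry — can be sketched in Python:

```python
import time

class ToyRealtimeStore:
    """Toy in-memory key-value store illustrating the access pattern of a
    real-time data platform: fast reads/writes keyed by (namespace, set, key),
    with an optional time-to-live (TTL) on each record."""

    def __init__(self):
        self._data = {}

    def put(self, key, record, ttl_seconds=None):
        # Store the record along with an absolute expiry time, if any.
        expiry = time.monotonic() + ttl_seconds if ttl_seconds else None
        self._data[key] = (record, expiry)

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        record, expiry = entry
        if expiry is not None and time.monotonic() > expiry:
            del self._data[key]  # lazily evict records past their TTL
            return None
        return record

store = ToyRealtimeStore()
store.put(("test", "users", "u42"), {"last_login": "2024-01-01"}, ttl_seconds=60)
print(store.get(("test", "users", "u42")))  # → {'last_login': '2024-01-01'}
```

A production system like Aerospike adds the pieces this sketch omits: sharded distribution across a cluster, replication, strong-consistency modes, and a hybrid memory/SSD storage layer.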

Hi Naren, welcome to AiThority.com’s Interview Series. Please tell us about your tech journey.

I’ve been in the tech and computer science space for over 40 years, starting with my undergraduate degree in Computer Science (a fairly new field at that time), followed by a PhD in Theoretical Computer Science. I’ve always been interested in algorithms and their application to technology. Over the years I’ve broadened my scope to translating those algorithmic ideas into high-performance systems at scale, starting when I joined Bell Labs Research. So many fundamental projects came out of there, and it was a collaborative environment where I could apply my theoretical expertise in practical areas, since I was developing algorithms for hardware and software being built by others in the group.

After Bell Labs, I worked at an internet and e-commerce startup, then at Google in New York. The office had fewer than 50 people when I started and about 5,000 when I left. It was so interesting to be part of that growth. I feel grateful to have been part of foundational projects, which are now components of Google’s infrastructure and Google Cloud.

I later worked at Amazon, where I got more involved with AI and ML in organizing the company’s e-commerce catalog so that users could more easily browse and locate the items they needed. Now at Aerospike, I’m continuing my work on a large-scale, high-performance system.

What are the major challenges in making AI more accessible by reducing server footprint? How do you overcome such hurdles?

One significant challenge lies in these new large language models – they’re really more like huge language models. They work with tens or hundreds of billions of parameters, which means they take incredible amounts of compute infrastructure to run — and even more so to build them and keep them up to date and trained. Currently these sizable LLMs can only be built by large cloud providers or companies with significantly deep resources. It’s not very accessible right now, and even though it may be a viable way forward, it’s not clear how to solve this issue in an economical way.

One possibility is to have more small, focused models that consume less power to develop and run, making them more democratically accessible to a wider range of people. It may be possible, for example, to develop custom models for specific domains that work with far fewer parameters than today’s largest LLMs, consuming much lower compute resources, and maybe even performing better because they are trained in a more focused manner. From Aerospike’s perspective, one of the things we do particularly well is lower the total cost of ownership (TCO) by packing large datasets into smaller server footprints, and we think this advantage will carry over well into the AI space.

As an AI leader in Real-time Data Management, which industries do you think will be the fastest to adopt AI/ML?

Industries where returns are inversely proportional to response time will benefit the most and would likely be the fastest to adopt AI/ML. These include finance, where quick decisions often bring the best results, and fraud detection, where you need to act very quickly in a critical moment. There are other real-time use cases like self-driving cars, where you have to react instantaneously to changing road conditions or unexpected obstacles. Aerospike is already being used in many of these contexts today. For example, PayPal has been using an Aerospike cluster to power its fraud detection systems, which need to decide whether to approve specific transactions in a split second.

How do you bring together people and technology in one place to deliver hyper-personalized user experiences?

The short answer is data — and as much of it as possible. Rather than answering questions on a generalized statistical basis, you want to generate answers that are very specific to the user or the person who’s asking the question. Rather than generating an answer based on statistics from behavior over the last year or even month, you want to use hyper-specific context from the very recent past.

What did the user do in the last two minutes?

How do they interact with the system?

What did they search for three hours ago?

Using all of that information, which means a whole lot more data, you feed that into the ML system, going from generalized statistical predictions to hyper-personalized ones. That translates to seeing more data, consuming more data, and acting on more data all in real-time on an individual contextual level.

Which industries see a higher tendency to be victims of a data breach? What are the factors that contribute to being a target?

The more data a company has, the higher its risk factor. Finance and Banking have historically been targets, but now with more companies in all domains storing much more user data to drive their AI/ML workflows, the potential of leaking sensitive data is dramatically higher.

Are AI-companies more at risk? If so, how?

AI companies storing massive amounts of data about users and their activities have a higher risk factor because more data could, in theory, be leaked. Also, since AI and ML are so hot right now, there’s intense competition in this space, and intellectual property breaches could be more likely because of bad actors trying to target companies and capitalize on their IP through unethical means.

What are your predictions for the AI domain for 2030?

By 2030, we will have experienced a shift towards more focused models. While everyone today wants to jump on the latest LLM bandwagon, these models are incredibly resource-hungry to build and run. That’s not a feasible model for everyone. LLMs will be simplified — demanding fewer resources and becoming more accessible.

Additionally, we will be much further advanced regarding vectors for semantic search and other more “classical” aspects of AI and ML. While LLMs received a lot of attention in 2023, they cannot do the job alone. LLMs represent just one corner of the AI world. Other critical segments are needed to complement the whole picture, so I think we will see more progress there.

Thank you, Naren! That was fun and we hope to see you back on AiThority.com soon.

[To share your insights with us as part of editorial or sponsored content, please write to sghosh@martechseries.com]

Naren has spent three decades in a variety of activities in the science and technology space – fundamental research at Bell Labs, working at startups in the email and advertising space, and leading engineering teams and launching new products and infrastructure in the areas of Search, Advertising, Storage, Networking, Distributed Systems, and AI/ML at Google and Amazon. Throughout his career, Naren has had a particular interest in adapting novel technological and scientific concepts to pragmatic applications in scalable and performant hardware and software. Naren has a B.Tech in Computer Science from the Indian Institute of Technology, Madras, and an M.S. in Math and a Ph.D. in Computer Science from the University of Wisconsin-Madison.

Aerospike unleashes the power of real-time data to meet the demands of The Right Now Economy. Global innovators and builders choose the Aerospike real-time, multi-model, NoSQL data platform for its predictable sub-millisecond performance at unlimited scale with dramatically reduced infrastructure costs. With support for strong consistency and globally distributed, multi-cloud environments, Aerospike is an essential part of the modern data stack for Adobe, Airtel, Criteo, DBS Bank, Experian, PayPal, Snap, and Sony Interactive Entertainment. A global company, Aerospike is headquartered in Mountain View, California, with offices in London, Bangalore, and Tel Aviv.
