AiThority Interview with Deepak Tewari, CEO at Privately
Could you tell us about your journey and what inspired you to start Privately?
About six years ago, while working at an infosec company, I had the realization that most free web services (social, search, etc.) have built businesses on the covert collection of user data, replete with all sorts of tricks to maximize engagement (autoplay on YouTube, Streaks on Snapchat) – all this well before the Cambridge Analytica scandal. This is neither ethical nor fair to end-users, who are largely oblivious of their digital footprints, of how their information is being sold to questionable third parties, and of how they are being ensnared by seductive tech into engaging with these apps at the cost of sleep, exercise, and wellbeing.
This sort of environment poses elevated risks for children in particular. So we created tools that help kids have a balanced and healthy interaction with their online environment by empowering, educating, and safeguarding them.
What is Privately? What technologies do you use at Privately?
Using Privately’s technology, app developers, gaming companies, web platforms, and device manufacturers can deploy child-centric online safety solutions focused on building resilience in children through a combination of self-help tools, awareness, and education. By empowering children to better manage their own digital lives, we can together develop better digital citizens who engage positively with digital media.
At the heart of Privately’s OWAS stack are Machine Learning (ML) classifiers that understand text and images and identify threats, risks, and sentiments. We use a range of technologies, from the Keras and TensorFlow frameworks, which provide ML functions inside the iOS and Android mobile environments, to Scala/Akka and Play.
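To make this concrete, here is a minimal, hypothetical sketch of how a small Keras text classifier could be exported to TensorFlow Lite so it can run inside a mobile app; the layer sizes, vocabulary, and label set are illustrative assumptions, not Privately's actual models.

```python
# Hypothetical sketch: a small Keras text classifier exported to TensorFlow Lite
# so it can be bundled and executed on-device (iOS/Android). All sizes and labels
# below are illustrative assumptions.
import tensorflow as tf

VOCAB_SIZE = 20000   # assumed vocabulary size
MAX_LEN = 64         # assumed maximum token sequence length
NUM_LABELS = 4       # e.g. neutral / bullying / grooming / private-info (illustrative)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(MAX_LEN,), dtype="int32"),
    tf.keras.layers.Embedding(VOCAB_SIZE, 32),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(NUM_LABELS, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# After training, convert to a TensorFlow Lite flatbuffer that the mobile app
# ships with, so raw text never has to leave the device for classification.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()
with open("text_risk_classifier.tflite", "wb") as f:
    f.write(tflite_bytes)
```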
What is e-safety? Why do you think it is important?
E-safety is interpreted differently by different stakeholders – parents, teachers, governments, academia, etc. In my opinion, e-safety, or online safety, is a subset of online wellbeing. A few years ago, companies providing pornographic filters were considered e-safety.
Kids now interact within social media and messaging apps – often over encrypted communications. They are not just dealing with external threats but also with things like excessive screentime. This demands a fresh look at, and definition of, e-safety – one that tackles screentime, bullying, grooming, inappropriate sharing, privacy violations, etc., and takes a holistic approach to fostering wellbeing.
What are the applications of your AI-based assistant Oyoty? Is it only child-centric or universal?
Oyoty is actually a demonstrator for the use case of a personal wellbeing assistant for kids. It uses our entire technology stack. So far we have built an assistant for children, but we know there are other demographics, such as the elderly, and other use cases, such as corporates.
How does this technology integrate with the user’s device? Which devices can it be accessed on?
The technology is integrated into an app that accesses data in different ways – e.g. its own keyboard, access to the photo library, and access to social media accounts. A number of different adaptations exist: Oyoty, as well as other apps like the BBC’s OwnIT and Projuventute’s 147, use large parts of our technology stack for their own services.
How does this assistant detect threats? Does it work on keywords?
The system uses a mix of ML, observational, and dictionary-based assets to detect private information being shared, inappropriate sharing (e.g. nudity in images), sexual conversations (grooming), bullying, excessive technology use, compromised sleep, and lack of movement. It can also detect and observe sentiment, both positive and negative, which can be a predictor of children’s overall wellbeing.
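As an illustration of mixing dictionary-based and ML-based detection, the sketch below combines a toy term list with scores from an assumed classifier interface; the labels, terms, and threshold are hypothetical, not Privately's actual assets.

```python
# Hypothetical sketch of combining a dictionary lookup with an ML score to flag
# risky messages; term list, labels, and threshold are illustrative only.
from dataclasses import dataclass

PRIVATE_INFO_TERMS = {"my address is", "my phone number is", "my school is"}  # toy list

@dataclass
class Detection:
    label: str
    score: float

def dictionary_hits(text: str) -> list[Detection]:
    """Rule-based pass: flag messages that contain known private-info phrases."""
    lowered = text.lower()
    return [Detection("private_info", 1.0) for term in PRIVATE_INFO_TERMS if term in lowered]

def classify(text: str, ml_model) -> list[Detection]:
    """Merge rule-based hits with model predictions above a confidence threshold."""
    detections = dictionary_hits(text)
    for label, score in ml_model.predict(text):   # assumed interface: [(label, score), ...]
        if score >= 0.8:                          # illustrative threshold
            detections.append(Detection(label, score))
    return detections
```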
How does the assistant process threats? What steps does it follow?
With each use case, the idea is to empower and educate the child. When a threat is detected, the child is presented with options to deal with the threat and/or education that helps them understand and deal with it.
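A minimal sketch of that empower/educate flow might look like the following, where each detected threat type maps to child-facing options plus a short lesson; the mapping and wording are illustrative, not Privately's actual content.

```python
# Hypothetical sketch: map a detected threat type to the options and education
# shown to the child. The labels, actions, and lessons below are illustrative.
RESPONSES = {
    "private_info": {
        "actions": ["Delete the message", "Edit before sending"],
        "lesson": "Sharing your address or phone number publicly can be risky.",
    },
    "bullying": {
        "actions": ["Block this contact", "Talk to a trusted adult"],
        "lesson": "You don't have to face unkind messages alone.",
    },
}

def respond_to(detection_label: str) -> dict:
    """Return the options and education presented to the child for a detection."""
    return RESPONSES.get(detection_label, {
        "actions": ["Dismiss"],
        "lesson": "Remember: you control what you share online.",
    })
```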
On what different platforms can one use this assistant?
Which geographies are you currently focusing on? Could you let us know the feedback on how they are adopting?
We are focusing on the UK and Switzerland in 2019. There are projects in Central Europe and the USA which will probably launch in 2020. The services were recently launched by the BBC and the feedback has been very good. I think there will be evidence and information available at scale in the next six months.
How do you deal with data privacy and privacy regulations?
This is probably the most important question here. Our huge USP is that our Machine Learning models work entirely within the smartphone environment (edge AI), and therefore user data and user privacy are completely protected. Thus our tech can be deployed within the smartphone environments of apps/games without needing to send user data to any sort of headend.
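A rough sketch of what on-device (edge) inference looks like with a bundled TensorFlow Lite model is shown below; the file name, tensor shapes, and tokenisation are assumptions, and the point is simply that classification runs locally with no network call.

```python
# Hypothetical sketch: running a bundled TFLite classifier entirely on-device,
# so no user text is transmitted to a server. File name and shapes are assumed.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="text_risk_classifier.tflite")
interpreter.allocate_tensors()
input_detail = interpreter.get_input_details()[0]
output_detail = interpreter.get_output_details()[0]

def predict_on_device(token_ids: np.ndarray) -> np.ndarray:
    """token_ids: a pre-tokenised message, shaped to the model's expected input."""
    interpreter.set_tensor(input_detail["index"], token_ids.astype(input_detail["dtype"]))
    interpreter.invoke()
    return interpreter.get_tensor(output_detail["index"])  # per-label probabilities
```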
What other areas could possibly benefit from this AI-based assistant?
As mentioned before, using Privately’s personal assistant technology, app developers, gaming companies, web platforms, and device manufacturers can deploy child-centric online safety solutions focused on building resilience in children through a combination of self-help tools, awareness, and education.
What is your future Product roadmap?
We will be integrating closely with some consumer mobile security solutions and also within some of the leading games on the market.
What digital technology start-ups and labs are you keenly following?
We are looking at companies that are doing exemplary work in Machine Learning and that clearly involves the big players, but also companies that are working in age verification and user privacy.
What technologies within your industry are you interested in?
New ML techniques, frameworks, privacy-preserving techniques, federated learning.
The Good, Bad and Ugly about AI that you have heard or predict.
AI is definitely transformational for many industries. In some cases, it might lead to job loss as predicted but I believe many new ones will also be created. AI will allow us to reach new frontiers in health, design, transport… I do not think there is any way but forward. The discussion about robotic overlords who subjugate humanity is a bit alarmist in my opinion.
What is your opinion on “Weaponization of AI”? How do you promote your ideas?
This to me is the real problem – far bigger than AI taking over the world. Malevolent human actors and institutions will definitely weaponize AI sooner or later. I see it as a continuation of the current conflicts that exist all over the world, and AI will be an important weapon in those arsenals.
What’s your smartest work-related shortcut or productivity hack?
Each day, have one most important thing you would like to finish, and cross it off at the end of the day.
Tag the one person in the industry whose answers to these questions you would love to read.
Thank you, Deepak! That was fun and hope to see you back on AiThority soon.
Deepak Tewari is the CEO of Privately, the AI startup behind the ground-breaking new online wellbeing and safety software suite (OWAS), the world’s first to be integrated directly into apps, games, and devices to empower and educate children to have a positive relationship with digital platforms.
Privately SA empowers children to be safer online.
Online child safety solutions have long been synonymous with parental control, monitoring, and restrictions. Apart from leaving children out of the learning process and promoting ‘underground’ behaviors, such solutions are largely ineffective and even counter-productive, according to the latest research from the London School of Economics.
With the proliferation of tablets and smartphones among very young children and the growing use of online applications, it becomes ever more important to involve children in managing their own online safety.
Privately’s OYOTY brings Swiss innovation to create a child-centric online safety solution focused on building resilience in children through a combination of self-help tools, awareness, and education. By empowering children to better manage their own digital lives we help develop better digital citizens who engage positively with digital media and are better able to leverage the wonderful possibilities that the Internet has to offer.