Artificial Intelligence | News | Insights | AiThority

DARPA and IBM Secure AI Systems from Hackers

DARPA and IBM’s Collaboration

DARPA, the research and development arm of the US Department of Defense (DoD), and IBM have been collaborating for the past four years on projects related to adversarial AI. The IBM team has been working under GARD (Guaranteeing AI Robustness Against Deception), a program that aims to construct defenses that can handle new threats, develop theory to make systems provably robust, and create tools to reliably evaluate the defenses of algorithms. The project is led by Principal Investigator (PI) Nathalie Baracaldo and co-PI Mark Purcell. As part of the project, researchers have upgraded IBM's Adversarial Robustness Toolbox (ART) to make it more applicable to the use cases encountered by the US military and other organizations building AI systems.


Popular Machine-Learning Frameworks

IBM donated ART to the Linux Foundation in 2020, hoping to inspire other AI experts to collaborate on tools for safeguarding real-world AI deployments. ART has its own GitHub repository and supports numerous prominent machine-learning frameworks, including TensorFlow and PyTorch. To continue meeting AI practitioners where they are, IBM has now added the updated toolkit to Hugging Face, which has swiftly become the internet's leading destination for finding and using AI models. The geospatial model developed with NASA is one of many IBM projects already publicly available on Hugging Face. The ART toolset on Hugging Face is aimed at users of models from that repository: it demonstrates how to integrate the toolbox with Transformers, the library used to build Hugging Face models, and provides examples of attacks and defenses for evasion and poisoning threats.
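To make the idea of an evasion attack concrete, here is a minimal hand-rolled sketch of a Fast Gradient Sign Method (FGSM) style perturbation, one of the classic evasion attacks ART implements, applied to a toy logistic-regression classifier. The weights, input, and epsilon below are illustrative assumptions, not part of ART's API:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm_perturb(x, y, w, b, eps):
    """FGSM-style evasion: step eps in the sign of the loss gradient.

    For logistic regression, d(cross-entropy)/dx = (sigmoid(w.x + b) - y) * w.
    """
    grad_x = (sigmoid(w @ x + b) - y) * w
    return x + eps * np.sign(grad_x)

# Toy classifier: predicts class 1 when w.x + b > 0.
w = np.array([2.0, -1.0])
b = 0.0
x = np.array([0.6, 0.1])  # correctly classified as class 1
y = 1.0

x_adv = fgsm_perturb(x, y, w, b, eps=0.5)
print(sigmoid(w @ x + b) > 0.5)      # original input: classified as 1 (True)
print(sigmoid(w @ x_adv + b) > 0.5)  # perturbed input: classified as 0 (False)
```

In ART itself the same mechanism is exposed through attack classes applied to wrapped framework models; the sketch only shows the underlying gradient-sign step.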



Challenges

Researchers in this dispersed community had each been assessing the efficacy of the defenses they built against their own standards. ART was the first to provide a single toolset covering many practical attacks, and it has since amassed hundreds of stars on GitHub. This exemplifies the community's cooperative spirit as it works toward the common goal of protecting AI pipelines. Although machine-learning models have come a long way, they remain fragile and vulnerable both to targeted attacks and to random noise from the real world.

Before GARD, the adversarial AI community was disorganized and immature. Researchers focused mainly on digital attacks, such as introducing small perturbations to images, even though these were not the most pressing issues. In the real world, the main concerns are physical attacks, such as covering a stop sign with a sticker to trick an autonomous vehicle's AI model, and attacks in which training data is poisoned.
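The training-data poisoning mentioned above can be sketched with a toy example. The nearest-centroid classifier and label-flipping attack below are illustrative assumptions, not GARD tooling; they show how relabeling a handful of training points can shift a model's decision boundary:

```python
import numpy as np

def fit_centroids(X, y):
    """Nearest-centroid classifier: one mean vector per class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, x):
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Two well-separated clusters around (-2, -2) and (+2, +2).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=-2.0, scale=0.5, size=(50, 2)),
               rng.normal(loc=+2.0, scale=0.5, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

probe = np.array([0.2, 0.2])  # a point just on the class-1 side
clean = fit_centroids(X, y)

# Label-flipping poisoning: relabel 15 class-1 training points as class 0,
# dragging the class-0 centroid toward class-1 territory.
y_poisoned = y.copy()
y_poisoned[50:65] = 0
poisoned = fit_centroids(X, y_poisoned)

print(predict(clean, probe), predict(poisoned, probe))  # prediction flips
```

The poisoned model misclassifies the probe point even though the probe itself was never touched, which is what makes poisoning attacks hard to detect at inference time.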


Before the advent of ART, researchers and practitioners in AI security lacked a central hub for exchanging attack and defense code. ART provides that platform, enabling teams to concentrate on more specific tasks. As part of GARD, the group has created resources that blue and red teams can use to evaluate and compare the performance of machine-learning models in the face of threats such as poisoning and evasion, and ART includes practical countermeasures against those attacks. Although the project is coming to a close this spring after four years, the work is far from over.
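At its core, the red-team/blue-team evaluation described here amounts to comparing a model's accuracy on clean inputs against its accuracy on attacked inputs. The following is a minimal sketch of that comparison using a toy linear classifier and an FGSM-style batch perturbation; it is not ART's actual evaluation API:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def accuracy(w, b, X, y):
    """Fraction of inputs whose predicted label matches y."""
    preds = (sigmoid(X @ w + b) > 0.5).astype(float)
    return float((preds == y).mean())

def fgsm_batch(w, b, X, y, eps):
    """FGSM-style perturbation of every row of X (the red team's attack)."""
    grads = (sigmoid(X @ w + b) - y)[:, None] * w  # per-row input gradient
    return X + eps * np.sign(grads)

# Toy data labeled by the same linear boundary the classifier uses,
# so clean accuracy is perfect by construction.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
w, b = np.array([1.0, 1.0]), 0.0
y = (X @ w + b > 0).astype(float)

clean_acc = accuracy(w, b, X, y)
adv_acc = accuracy(w, b, fgsm_batch(w, b, X, y, eps=0.8), y)
print(clean_acc, adv_acc)  # accuracy drops sharply under attack
```

The gap between the two numbers is the kind of robustness metric blue and red teams can benchmark defenses against.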

