Artificial Intelligence | News | Insights | AiThority

World’s First High-Reliability Robotic Picking Cell for Transparent Poly Bags to Be Unveiled at Automatica

Three makers of next-generation pick-and-place robotic technologies will unveil the world’s first high-reliability robotic picking cell capable of robustly handling transparent poly bags in high-variability, mixed-material environments. It will be on display in Munich at Automatica, a leading exhibition for smart automation and robotics, at the Zivid booth (#B5300).

“Where transparent poly bags are involved in e-commerce fulfillment, pick and place automation has less than one percent market penetration,” said Preben Hjørnet, CTO for the Gripper Company. “The world’s largest e-commerce marketplaces have worked on this for years, but conventional grippers have always limited success rates to less than 70 percent. This collaboration more than doubles the best throughput speed currently available, with much higher success rates.”


Pairing the Zivid 2+ 3D camera with Fizyr’s deep-learning computer vision software provides unparalleled robustness for handling transparent objects. Fizyr identifies each item through segmentation, shape detection and material detection; its algorithms then prioritize actions via cascading considerations: which items lie on top of others, the location and makeup of each item’s surfaces, whether the robot’s gripper will have adequate room to move, where it must begin its task, and more. Fizyr then instructs the robot to optimally deploy MAXXgrip from The Gripper Company. After each pick, a new image allows Fizyr to recalculate, account for any changes, and direct the robot’s next step, all in a fraction of a second.

Thanks to near-instantaneous image capture, processing and robot instruction, the robot never needs to slow down and consistently performs at its fastest possible speed. It can pick, move and place up to 1,200 fast-moving consumer goods per hour in high-variability environments, with much greater success than leading in-market solutions, even on transparent poly bags, which present many challenges for traditional grippers. The first-of-its-kind robotic cell capably handles a wide variety of goods and packaging types, including shoeboxes (tuck boxes), books, telescopic boxes and stylepack apparel bags, all known to cause fulfillment bottlenecks.
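The capture-detect-prioritize-pick cycle described above can be sketched as a simple sense-plan-act loop. This is purely an illustrative assumption of how such a cell might be structured: the names (`PickCandidate`, `prioritize`, `pick_loop`) and the cascading criteria are hypothetical, not the actual Zivid, Fizyr or MAXXgrip APIs, and the camera, detector and gripper are stubbed out as plain callables.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional


@dataclass
class PickCandidate:
    """One detected item, with the attributes the article describes."""
    item_id: int
    on_top: bool             # not occluded by other items
    surface_quality: float   # 0..1: how pickable the exposed surface is
    gripper_clearance: bool  # enough room for the gripper to approach


def prioritize(candidates: List[PickCandidate]) -> Optional[PickCandidate]:
    """Cascade the considerations from the article: only items that are on
    top and reachable are viable; among those, prefer the best surface."""
    viable = [c for c in candidates if c.on_top and c.gripper_clearance]
    return max(viable, key=lambda c: c.surface_quality, default=None)


def pick_loop(capture: Callable, detect: Callable, pick: Callable,
              max_picks: int = 5) -> int:
    """Sense-plan-act loop: re-image after every pick so the planner always
    works from the current state of the bin. Returns the number of picks."""
    picks = 0
    while picks < max_picks:
        frame = capture()            # stand-in for a Zivid 2+ 3D+2D capture
        candidates = detect(frame)   # stand-in for Fizyr-style detection
        target = prioritize(candidates)
        if target is None:           # nothing safely pickable right now
            break
        pick(target)                 # stand-in for a MAXXgrip pick
        picks += 1
    return picks
```

At the stated rate of 1,200 picks per hour, each iteration of such a loop would have a budget of roughly three seconds, which is why the announcement stresses instantaneous capture, processing and robot instruction.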


This collaboration integrates three next-generation technologies for the first time:

  • The Zivid 2+ 3D camera from Zivid: a new general-purpose 3D+2D camera for the highest-performing robotic applications.
  • Deep-learning computer vision software from Fizyr: the smartest, fastest and most effective brain available to maximize robotic capabilities.
  • MAXXgrip from The Gripper Company: a unique hybrid gripper whose smart, strong pinch grip keeps apparel from swaying, safely enabling extremely high pick-and-place rates.


Robotics integrators, systems integrators and robot manufacturers partner with Fizyr, The Gripper Company and Zivid to deliver reliable and productive automation solutions for order picking, parcel induction, mixed-SKU depalletizing, loose-load trailer unloading and a wide range of other tasks in high-variability environments.


