
Neurophos Raises $7.2 Million for Data Center Metamaterial AI Chips

Neurophos has raised a $7.2 million seed round to productize a breakthrough in metamaterials and optical AI inference chips. The company has also joined the Silicon Catalyst incubator program to accelerate product development.

Neurophos, a spinout from Duke University and Metacept Inc., has raised a $7.2 million seed round to productize a breakthrough in both metamaterials and optical AI inference chips.

The round was led by Gates Frontier, with participation from MetaVC, Mana Ventures, and others. The seed funding will support production of a proprietary metasurface whose advanced optical properties allow it to serve as a tensor core processor. The company will also hire a team of engineers in Austin, Texas, a major silicon engineering hub.



While GPUs have been enormously successful at accelerating AI workloads, digital approaches are ultimately limited by power consumption. Proponents of optical computing, by contrast, argue that photonics can vastly reduce power consumption and therefore push compute speeds far beyond what is possible with modern GPUs.

Unfortunately, despite the large amounts of capital recently invested in optical compute for AI, the field's success has been limited, largely because existing optical devices are too large and bulky to scale. Metamaterials, however, enable new paradigms for controlling the flow of light. Their discovery unleashed an enormous burst of creativity, leading to demonstrations of invisibility cloaks, negative-refractive-index materials, and many other exotic devices.

Neurophos’ optical metasurfaces are designed for use in data centers, and the company’s approach is already shattering world records in computational energy efficiency. Neurophos plans to use high-speed silicon photonics to drive a metasurface in-memory processor capable of fast, efficient AI compute.

Estimated global data center electricity consumption in 2022 was 240-340 TWh, or around 1-1.3% of global electricity demand, and the exponential growth of AI inference workloads threatens to push this demand to unsustainable levels. Neurophos’ technology aims to deliver substantially more compute per dollar of CAPEX and OPEX at the same energy consumption, reducing the total cost of ownership of AI accelerator chips and data centers.
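As a rough sanity check on those figures, the share works out as below (a minimal sketch; the ~26,000 TWh global electricity demand figure for 2022 is an assumed round number, not from the article):

```python
# Back-of-the-envelope check: data center electricity as a share of global demand.
# The 240-340 TWh range comes from the article; the ~26,000 TWh global demand
# figure for 2022 is an assumption used only for illustration.
datacenter_twh_low, datacenter_twh_high = 240, 340
global_demand_twh = 26_000  # assumed approximate global electricity demand, 2022

share_low = 100 * datacenter_twh_low / global_demand_twh
share_high = 100 * datacenter_twh_high / global_demand_twh
print(f"Data centers: {share_low:.1f}%-{share_high:.1f}% of global electricity demand")
# -> roughly 0.9%-1.3%, consistent with the ~1-1.3% cited above
```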

Patrick Bowen, Neurophos CEO, says: “The most important factor in optical processors is scaling. Optical processors become both exponentially faster and more energy efficient on a per-operation basis as you make them larger. This means that in a finite chip area, the most important factor is how small you can build the optical devices that compose your processor. By leveraging metamaterials in a standard CMOS process, we have figured out how to shrink an optical processor by 8000X, which will give us orders of magnitude improvement over GPUs today.”
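To make the scaling argument concrete, here is a minimal, illustrative-only sketch (all footprints and the chip area are hypothetical, and it assumes the 8000X figure can be read as a per-device area reduction, with the processor modeled as an N x N array of multiply-accumulate elements in a fixed chip area):

```python
# Illustrative sketch of the scaling argument: smaller optical devices allow a
# larger N x N array in the same chip area, and each optical pass through an
# N x N array performs N^2 multiply-accumulates (MACs).
# All footprints and the chip area below are hypothetical.
import math

chip_area_mm2 = 100.0               # hypothetical chip area
conventional_device_um2 = 10_000.0  # hypothetical conventional photonic modulator footprint
shrink_factor = 8000                # per-device area reduction, as claimed above

def array_side(device_area_um2: float) -> int:
    """Largest N such that an N x N array of devices fits in the chip area."""
    devices = chip_area_mm2 * 1e6 / device_area_um2
    return math.isqrt(int(devices))

n_before = array_side(conventional_device_um2)
n_after = array_side(conventional_device_um2 / shrink_factor)
print(f"Array side N: {n_before} -> {n_after}")
print(f"MACs per optical pass (N^2): {n_before**2:,} -> {n_after**2:,}")
```

With these assumed numbers, the same chip area goes from a 100 x 100 array to roughly a 9,000 x 9,000 array, i.e. from about 10^4 to about 8 x 10^7 MACs per optical pass.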

Alexander Hayes, co-founder of Metacept Inc., says: “Leaving the speed and energy use bottlenecks behind by deviating far from the Von Neumann architecture represents one of the most exciting and potentially important metamaterial and photonic applications we’ve ever considered.”

MetaVC Partners provided Neurophos’ initial funding and an exclusive license to the fund’s metamaterials IP portfolio for optical computing. Neurophos was spun out of Metacept, an incubator led by David R. Smith, James B. Duke Professor of Electrical and Computer Engineering, that focuses on creating metamaterials-based companies and collaborates with Professor Smith’s research group at Duke University. Neurophos CTO Tom Driscoll previously founded the metamaterials-based radar company Echodyne.

David Smith of Duke University says: “The Neurophos team has realized that the really fundamental problems of analog inference processing require a breakthrough at the level of the fundamental physics of the optical modulators. Their metamaterial is a ground-up breakthrough enabling an extraordinarily dense computing chip for next-generation AI applications.”


Neurophos’ AI chips can be fabricated using standard CMOS processes, giving the company a direct path to volume manufacturing.

The company is also joining Silicon Catalyst, the world’s only incubator and accelerator focused on semiconductor solutions (including photonics, MEMS, sensors, IP, materials, and life sciences), which helps startups move from idea through prototype and onto a path to volume production. Silicon Catalyst has developed an unparalleled support ecosystem for its semiconductor start-ups, providing a strong network of financiers, business advisors, and industry professionals who help companies launch and scale in the market. In addition, the incubator provides privileged access to services, expertise, and intellectual property that can empower its portfolio companies’ technological innovation.

Paul Pickering, Managing Partner, Silicon Catalyst says: “Neurophos represents much-needed progress in analog optical computing, bringing the performance of silicon photonics to the existing manufacturing infrastructure of CMOS foundries. We are confident that they will be one of the leaders of the next generation of AI hardware. This is how you get to tomorrow quickly and without wasted capital. We are thrilled to have them in the program.”

Neurophos Breakthroughs In Depth

Neurophos’ advancements will decrease the size and energy needs of silicon photonic optical chips, making them better suited to running artificial intelligence workloads such as large language models (LLMs).

Neurophos’ metamaterial-based optical modulators are more than 1,000 times smaller than those available in a standard foundry PDK (Process Design Kit). This enables a technology roadmap to deliver over 1 million TOPS (trillions of operations per second) of performance. For comparison, an Nvidia H100 SXM5 today delivers at most about 4,000 TOPS of DNN (deep neural network) performance.
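The headroom implied by those two figures is straightforward to compute (a trivial check, using only the numbers quoted above):

```python
# Ratio implied by the figures quoted above: roadmap target vs. a current GPU.
roadmap_tops = 1_000_000  # >1 million TOPS roadmap figure cited above
h100_tops = 4_000         # ~4,000 TOPS peak DNN figure cited for the H100 SXM5

print(f"Implied headroom: {roadmap_tops / h100_tops:.0f}x")  # -> 250x
```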

Optical chips have the potential to increase processor speed while massively reducing power consumption. Neurophos will enable this technology to be used in AI data centers. That market, which is dominated by Intel and Nvidia, currently relies on traditional silicon semiconductors that generate enormous amounts of heat and are struggling to scale to the performance demands of LLMs.

Neurophos combines two breakthroughs. The first is an optical metasurface that enables silicon photonic computing capable of ultra-fast AI inference that outstrips the density and performance of both traditional silicon computing and silicon photonics.

The second is a Compute-In-Memory (CIM) processor architecture fed by high-speed silicon photonics to deliver fast, efficient matrix-matrix multiplication, which makes up the overwhelming majority of all operations when running, for instance, a neural net.
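As a rough illustration of why matrix multiplication dominates, the sketch below counts operations in a small two-layer network (the layer sizes are hypothetical and are not Neurophos’ architecture):

```python
# Minimal illustration: matrix-multiply MACs dwarf the element-wise work
# (activations, bookkeeping) in a typical dense neural-net layer stack.
# Layer sizes below are hypothetical, chosen only for illustration.
import numpy as np

batch, d_in, d_hidden, d_out = 32, 1024, 4096, 1024
x = np.random.randn(batch, d_in)
W1 = np.random.randn(d_in, d_hidden)
W2 = np.random.randn(d_hidden, d_out)

h = np.maximum(x @ W1, 0.0)  # matmul followed by ReLU
y = h @ W2                   # matmul

matmul_macs = batch * d_in * d_hidden + batch * d_hidden * d_out
elementwise_ops = h.size + y.size  # rough count of ReLU/bookkeeping operations
print(f"Matmul MACs:      {matmul_macs:,}")
print(f"Element-wise ops: {elementwise_ops:,}")
print(f"Ratio:            {matmul_macs / elementwise_ops:,.0f}x")
```

With these sizes the matrix multiplies account for well over 99% of the arithmetic, which is why accelerating them accelerates the whole workload.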

The metasurface-enabled optical CIM elements are thousands of times smaller than traditional silicon photonics modulators, enabling the processing of vastly larger matrices on-chip. This results in an unprecedented increase in the computational density. In optical computing, energy efficiency is proportional to array size, so Neurophos’ processor is hundreds of times more energy efficient than alternatives.
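The proportionality between array size and energy efficiency can be sketched as follows (an illustrative model of the general argument for analog optical matrix multipliers, not a Neurophos specification: one pass through an N x N array performs N^2 MACs, while the dominant electro-optic I/O cost scales with roughly 2N conversions):

```python
# Illustrative model of "energy efficiency is proportional to array size."
# Assumption (general argument, not a Neurophos figure): per pass, an N x N
# optical array performs N^2 MACs, while driving ~N modulators and reading
# ~N detectors costs roughly 2N electro-optic conversions, so energy per MAC
# falls as ~1/N as the array grows.
energy_per_conversion_pj = 1.0  # hypothetical energy per electro-optic conversion

for n in (64, 512, 4096):
    macs_per_pass = n * n
    io_energy_pj = 2 * n * energy_per_conversion_pj
    fj_per_mac = 1000 * io_energy_pj / macs_per_pass
    print(f"N = {n:5d}: {fj_per_mac:8.3f} fJ per MAC")
```

Under this model, growing the array from 64 x 64 to 4096 x 4096 reduces the energy per MAC by a factor of 64, which is the intuition behind the claim that larger optical arrays are disproportionately more efficient.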
