How Europe’s Researchers Are Using GPUs to Answer Science’s Biggest Questions

The conference, now in its third year, is a celebration of groundbreaking GPU-accelerated work across the region. Nearly 300 developers, startups and researchers took the stage, sharing innovative projects. Among them were some of the major science centers in Europe, spanning fields as diverse as particle physics, climate research and neuroscience.

Understanding the Universe

Técnico Lisboa, Portugal

Nuclear energy is generated through nuclear fission: the process of splitting apart an atom’s nucleus, which creates both usable energy and radioactive waste. A cleaner and more powerful alternative is nuclear fusion, the joining together of two nuclei.

But so far, scientists haven’t been able to sustain a nuclear fusion reaction long enough to harness its energy. Using deep learning algorithms and a Tesla P100 GPU, researchers at Portugal’s Técnico Lisboa are studying the shape and behavior of the plasma inside a fusion reactor.
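
The article doesn’t detail the model itself, but the sketch below shows, in broad strokes, what a GPU-accelerated disruption predictor could look like in PyTorch. The signal count, window length and architecture are illustrative assumptions, not the Técnico Lisboa model.

```python
# Hypothetical sketch: a small 1-D convolutional classifier that flags an
# imminent plasma disruption from a window of diagnostic signals.
# The number of signals, window length and layer sizes are assumptions
# made for illustration only.
import torch
import torch.nn as nn

class DisruptionClassifier(nn.Module):
    def __init__(self, n_signals: int = 32, window: int = 256):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_signals, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(128, 1)  # probability that a disruption is imminent

    def forward(self, x):              # x: (batch, n_signals, window)
        h = self.features(x).squeeze(-1)
        return torch.sigmoid(self.head(h))

device = "cuda" if torch.cuda.is_available() else "cpu"
model = DisruptionClassifier().to(device).eval()

# One window of recent sensor readings, evaluated on the GPU when available.
window = torch.randn(1, 32, 256, device=device)
with torch.no_grad():
    p_disruption = model(window).item()
print(f"disruption probability: {p_disruption:.2f}")
```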

Gaining insight into the factors at play during nuclear fusion is essential for physicists. If researchers can predict when a reaction is about to be disrupted, they can take preventive action to prolong it until enormous amounts of energy can be captured.

GPUs are essential for running these neural network inferences in real time during a fusion reaction. The deep learning models currently predict disruptions with 85 percent accuracy, matching state-of-the-art systems. By adding more probes that collect measurements inside the reactor, and by using a multi-GPU system, the researchers can reach even higher accuracy levels.
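
As a rough illustration of the real-time constraint, the sketch below times a batch of probe windows through a stand-in model and, when more than one GPU is present, spreads the batch across them with PyTorch’s DataParallel. The stand-in model, batch size and window shape are assumptions for illustration, not details from the research.

```python
# Hypothetical sketch: time a batch of probe-measurement windows through a
# stand-in model, using multiple GPUs when they are available.
import time
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Stand-in for a trained disruption model (illustrative only).
model = nn.Sequential(nn.Flatten(), nn.Linear(32 * 256, 1), nn.Sigmoid()).to(device).eval()

if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)    # spread batched windows across GPUs

window = torch.randn(8, 32, 256, device=device)  # e.g. 8 probe groups per step
with torch.no_grad():
    start = time.perf_counter()
    probs = model(window)
    if device == "cuda":
        torch.cuda.synchronize()      # wait for the GPU before stopping the clock
    elapsed = time.perf_counter() - start

print(f"inference took {elapsed * 1e3:.2f} ms for {window.shape[0]} windows, "
      f"max disruption probability {probs.max().item():.2f}")
```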

European Organization for Nuclear Research, Switzerland

Physicists have long been in search of a theory of everything, a mathematical model that works in every case, even at velocities approaching the speed of light. CERN, the European Organization for Nuclear Research, is a major center for this research.

Best known in recent years for the discovery of the Higgs boson, often called the “God particle,” the organization uses a machine called the Large Hadron Collider to create collisions between subatomic particles.

The researchers use software first to simulate the interactions they expect to see in the collision, and then to compare the real collision with the original simulation. These experiments require a system that can handle five terabytes of data per second.
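
To illustrate that simulate-then-compare workflow in miniature, the sketch below bins a reconstructed quantity for simulated and recorded events and computes a simple chi-square-style agreement measure. The variable, binning and comparison metric are illustrative assumptions, not CERN’s actual analysis pipeline.

```python
# Hypothetical sketch: compare the distribution of a reconstructed quantity
# (here an invariant mass in GeV) between simulated and recorded events.
import numpy as np

rng = np.random.default_rng(0)
simulated = rng.normal(loc=125.0, scale=2.0, size=100_000)   # expected signal shape
recorded  = rng.normal(loc=125.1, scale=2.1, size=100_000)   # stand-in for detector data

bins = np.linspace(115, 135, 81)
sim_counts, _ = np.histogram(simulated, bins=bins)
rec_counts, _ = np.histogram(recorded, bins=bins)

# Per-bin chi-square between the two histograms (skipping empty bins).
mask = (sim_counts + rec_counts) > 0
chi2 = np.sum((rec_counts[mask] - sim_counts[mask]) ** 2
              / (rec_counts[mask] + sim_counts[mask]))
print(f"chi2 over {mask.sum()} bins: {chi2:.1f}")
```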

“We are working to speed up our software and improve its accuracy, to face at best the challenges of the next Large Hadron Collider phase,” said CERN researcher Andrea Bocci. “We are exploring the use of GPUs to accelerate our algorithms and to integrate fast inference of deep learning models in our next-generation real-time data processing system.”
