NetApp - Big Data and AI
AI explained: Deepfakes
From use case to infrastructure
Digitalization is generating ever-growing volumes of data: from production lines, customer relationships, transport systems, sensors, artificial intelligence (AI) and connected cars. Successfully processing this mostly unstructured information can unlock key insights and deliver a major competitive advantage.
Big data analytics also allows companies to draw important conclusions for their own business.

#team-netapp
Getting insights even faster with data science and artificial intelligence
Thanks to artificial intelligence and deep learning, we can now analyze data at unprecedented speed, delivering insights in minutes that previously took days, weeks or even months. Even so, we must not forget that artificial intelligence depends on human input: if that input is flawed for any reason, the errors are reflected in the results.
Deepfakes – the power of data
Have you ever heard of deepfakes? The term combines deep learning, a branch of machine learning, with fake. In a deepfake, a person acts in front of a camera, but their entire appearance is recalculated so they look and sound like someone else, such as Donald Trump, Michael Caine, Al Gore, Mark Zuckerberg or Barack Obama. Deep learning involves collecting and analyzing huge amounts of data about people, including photos, videos and audio. The more data available, the more convincing a potential deepfake can be. Many of these fakes are so persuasive that laypersons can no longer tell the real person apart from their digital doppelgänger.
The danger of deepfakes, and the responsibility that comes with using them, became clear in 2015, when a video of the Greek finance minister Yanis Varoufakis was manipulated to show him raising his middle finger. The clip triggered a political scandal, and when the German satirist Jan Böhmermann later claimed responsibility for the fake, barely anyone believed him.
Big data – the positives outweigh the negatives
We now know the great responsibility that comes with using big data. But we should also remember that big data and artificial intelligence power many technological innovations and simplify processes that would otherwise be long and laborious. The time for these technologies has come, and the major cloud providers have recognized it: they offer cloud-based AI services and platforms that are easy to order and integrate.
NetApp and NVIDIA – a strong team
Now you can unlock the full potential of AI and deep learning. You can simplify, accelerate and integrate your data pipeline with the NetApp ONTAP AI proven architecture, powered by NVIDIA DGX systems and cloud-connected NetApp all-flash storage. A data fabric architecture lets you reliably streamline the data path from the point where data is created, through the data center, to the cloud, while accelerating analytics, training and inference.
Why NetApp for artificial intelligence?
Using AI creates huge amounts of data that must flow smoothly through a five-stage pipeline; even the tiniest bottleneck can stop a project in its tracks. The combination of NetApp ONTAP AI, NVIDIA DGX servers and cloud-connected all-flash storage from NetApp therefore delivers consistent AI performance, from data generation and analysis all the way to archiving.
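To make the pipeline idea concrete, here is a minimal sketch of five chained stages. The stage names (ingest, prepare, train, infer, archive) and the toy "model" are illustrative assumptions for this example, not NetApp's official terminology; the point is simply that each stage consumes the previous stage's output, so a stall anywhere blocks everything downstream.

```python
from typing import List

def ingest(raw: List[str]) -> List[str]:
    # Stage 1: collect raw records (e.g., sensor readings) and drop blanks.
    return [r.strip() for r in raw if r.strip()]

def prepare(records: List[str]) -> List[str]:
    # Stage 2: clean and normalize the data for training.
    return [r.lower() for r in records]

def train(dataset: List[str]) -> dict:
    # Stage 3: fit a toy "model" -- here, just record frequencies.
    model: dict = {}
    for r in dataset:
        model[r] = model.get(r, 0) + 1
    return model

def infer(model: dict, query: str) -> int:
    # Stage 4: serve a prediction from the trained model.
    return model.get(query.lower(), 0)

def archive(model: dict) -> str:
    # Stage 5: persist results for long-term retention (stubbed here).
    return f"archived model with {len(model)} entries"

# Run the pipeline end to end: each stage feeds the next.
raw = ["Sensor-A", "sensor-a", "  ", "Sensor-B"]
model = train(prepare(ingest(raw)))
print(infer(model, "SENSOR-A"))  # frequency of "sensor-a": 2
print(archive(model))
```

In a real deployment, each stage would read from and write to shared storage, which is why storage throughput matters at every step of the pipeline.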
Any Questions?
If you would like to know more about this subject, we are happy to assist you.
Contact us