Artificial intelligence (AI) is powering change in virtually every industry across the world. As individuals and companies increasingly rely on data-driven systems, the demand for AI technology grows. From data search and recommender systems to medical imaging and improved supply chain management, AI provides companies with the analytics platforms, tools, and algorithms needed to work more efficiently, helping to prevent disease, build smart cities, and revolutionize analytics. These are some of the solutions made possible by the AI, deep learning, and data science technologies developed by NVIDIA, which are empowering organizations to turn moonshots into tangible results.
Businesses use machine learning (ML) to enhance their products, services, and operations. By drawing on vast volumes of historical data, they can build models that forecast consumer behavior and optimize internal processes.
While ML provides incredible value to an enterprise, current CPU-based methods can add complexity and overhead, reducing the return on investment for businesses.
Now, with data science acceleration platforms that combine optimized hardware and software, the usual complexities and inefficiencies of ML disappear. Data scientists can iterate quickly on features, use large datasets to make highly accurate forecasts, and bring value-generating solutions into production with ease. GPU acceleration is accessible through popular programming languages such as Python and Java, making it easy to run projects in the cloud or on-premises.
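As a sketch of why accessing GPU acceleration from Python requires so few changes: libraries such as CuPy mirror the NumPy API, so an import swap is often all that is needed. The library choice and CPU fallback below are illustrative assumptions, not an official NVIDIA example.

```python
# Sketch: GPU acceleration via a drop-in NumPy-compatible library.
# CuPy mirrors the NumPy API, so the same code runs on either backend.
try:
    import cupy as xp          # GPU backend, if CUDA is available
except ImportError:
    import numpy as xp         # CPU fallback keeps the sketch runnable anywhere

def standardize(values):
    """Scale a feature column to zero mean and unit variance."""
    arr = xp.asarray(values, dtype=xp.float64)
    return (arr - arr.mean()) / arr.std()

z = standardize([2.0, 4.0, 6.0, 8.0])
print(float(z.mean()))  # ~0.0
```

Because both backends expose the same interface, the same feature-engineering code can move between a laptop CPU and a data center GPU without modification.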
By leveraging accelerated ML, businesses can give data scientists the tools they need to get the most out of their data. The advantages include spending less time waiting for applications to complete and more time evaluating solutions, with results up to 19 times faster than the industry-standard CPU-based approach. Accelerated ML also scales the existing data science toolchain with minimal code changes and no new software to learn, and its high-performance processing can analyze extensive datasets to deliver accurate conclusions and faster reporting.(1)
Enterprise and Developer
CUDA: parallel computing platform. CUDA is a hardware and software technology that allows a graphics processor to use several computing cores to perform general-purpose mathematical calculations.
IndeX: a commercial 3D volumetric visualization SDK that lets developers visualize and interact with massive datasets in real time and navigate to the most pertinent data.
Iray: produces realistic images by simulating the physics of light and materials.
Video Games
BatteryBoost: power-saving technology that delivers up to 2x longer battery life while gaming.
GameWorks: NVIDIA’s award-winning GameWorks SDK gives developers access to the best technology from the leader in visual computing.
Architectures
Ampere: highest-performing elastic data centers and graphics cards for gamers and creators.
Turing: combines real-time ray tracing, artificial intelligence, simulation, and rasterization to revolutionize computer graphics.
Volta: NVIDIA Volta is the next big thing in AI. Volta is the most efficient GPU architecture, with over 21 billion transistors.
Industry Technologies
AI computing: enables every industry to extract intelligence from big data to solve its most challenging problems.
Deep Learning: a branch of machine learning in which neural networks learn multiple levels of complexity.
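The layered structure behind that entry can be sketched in a few lines: each layer applies a linear map followed by a nonlinearity, and stacking layers lets the network represent increasing degrees of complexity. The shapes and random weights below are purely illustrative.

```python
import numpy as np

def relu(x):
    """Elementwise nonlinearity: negative values become zero."""
    return np.maximum(0.0, x)

def forward(x, layers):
    """Pass input through a stack of (weights, bias) layers."""
    for W, b in layers:
        x = relu(x @ W + b)   # linear map, then nonlinearity
    return x

rng = np.random.default_rng(0)
layers = [
    (rng.normal(size=(4, 8)), np.zeros(8)),   # input -> hidden layer
    (rng.normal(size=(8, 3)), np.zeros(3)),   # hidden -> output layer
]
out = forward(np.ones((1, 4)), layers)
print(out.shape)  # (1, 3)
```

Real deep learning frameworks add trained weights, many more layers, and GPU execution, but the layer-stacking principle is the same.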
Healthcare Advances
Imaging: Clara Imaging is a software framework that allows researchers to increase data annotation speed and build domain-specialized AI models.
Genomics: Using HPC to accelerate genomic analysis in population and cancer genomic studies.
Smart Hospitals: Clara Guardian is a technology platform that combines intelligent video analytics and conversational AI with smart sensors for tasks such as protective-mask detection and remote patient monitoring in healthcare.(2)
NVIDIA posted revenue of $3.87 billion in the second quarter of 2020, up 50% from the same period of 2019, bringing revenue for the first half of the year to $6.95 billion, a 45% increase. The Computing and Networking segment grew from $694 million to $2.96 billion, the US segment grew from $353 million to $1.44 billion, and the China segment grew 42% to $1.61 billion. According to the company’s chief financial officer, Colette Kress, the effects of the pandemic will “likely reflect this evolution in enterprise workforce trends.”
Federated Learning and Its Implications for Mammography Models
A massive amount of data is required to create clinically applicable deep learning models. Federated learning (FL) proposes sharing model weights rather than the underlying data, so raw patient data never leaves its source. To investigate the accuracy of FL, a study developed a breast density classification model using mammography data, with the aim of training the model without sharing any data between institutions. Results revealed that FL could improve the models and increase their generalizability by incorporating data from other sources, such as test data from other clients.(4)
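The weight-sharing idea can be sketched as federated averaging: each client trains locally, only the model weights travel to a server, and the server averages them weighted by each client's dataset size. This is a simplified illustration of the general technique, not the study's actual implementation.

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Average client model weights, weighted by local dataset size.
    Raw data never leaves the clients; only weight vectors are shared."""
    total = sum(client_sizes)
    stacked = np.stack(client_weights)                 # (clients, params)
    coeffs = np.array(client_sizes, dtype=float) / total
    return np.tensordot(coeffs, stacked, axes=1)       # weighted sum

# Three hypothetical hospitals with different amounts of mammography data.
weights = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
sizes = [100, 100, 200]
global_w = federated_average(weights, sizes)
print(global_w)  # [0.75 0.75]
```

In a full FL round, the averaged weights would be sent back to each client for the next round of local training.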
Radiology and Deep Learning
Radiology image reporting is undergoing many technological advancements; however, AI assistance for medical professionals reviewing radiographs is only as good as the training the model receives.
In this study, including natural-language report text alongside the radiographs resulted in lower accuracy than using images alone, while models trained on historical chest X-rays annotated for any abnormality achieved excellent accuracy. Differences in accuracy were measured with a novel metric, and the authors point to new methods for maximizing accuracy in radiology.(5)
AI Helps Physicians Exclude Coronary Atherosclerosis as the Cause of Chest Pain in the ED
Two trials evaluating more than 600 patients demonstrated the value of layering AI on coronary computed tomography angiography (CCTA) for patients with chest pain, with the approach succeeding in over 95% of cases. Using an algorithm and machine learning, this approach assists physicians in ruling out coronary artery atherosclerosis on CCTA in chest-pain presentations.(6)
AI for the Detection of COVID-19 on Chest CT Using Multinational Datasets
The use of AI for diagnosing COVID-19 could provide a precise, automated, and reproducible system for classifying and quantifying the disease. This study developed a deep learning (DL) system using chest CT scans from a heterogeneous, globally diverse, multi-institutional dataset. In total, 2,724 scans were used (1,387 for algorithm development and 1,337 for testing and evaluation), with a disease prevalence of 24.4% (326/1,337) in the test set.
The results showed that the system could achieve up to 90.8% accuracy, 84% sensitivity, and 93% specificity, with a false-positive rate of 10%. Although the method may not be suitable for COVID-19 screening on its own, this DL solution may help medical professionals as a supporting or supplementary tool.(7)
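The reported figures are standard confusion-matrix quantities. The counts below are made up for illustration only, chosen to reproduce similar percentages; they are not the study's data.

```python
def classification_metrics(tp, fp, tn, fn):
    """Compute accuracy, sensitivity (recall), and specificity
    from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)      # true-positive rate
    specificity = tn / (tn + fp)      # true-negative rate
    return accuracy, sensitivity, specificity

# Illustrative counts only (not the study's actual confusion matrix).
acc, sens, spec = classification_metrics(tp=84, fp=7, tn=93, fn=16)
print(acc, sens, spec)  # 0.885 0.84 0.93
```

Note that the false-positive rate is defined as 1 minus specificity, which is why specificity and the false-positive rate are usually reported together.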
Clinical Development and Validation of a Radiology AI System for COVID-19
Researchers in Massachusetts have proposed a new end-to-end AI system for COVID-19 built on existing hospital radiology infrastructure. The software receives clinical studies in real time and runs a model that performs inference on chest radiographs of patients with suspected or confirmed COVID-19. This type of technology enables physicians and researchers to assess the severity of COVID-19 and provide better patient care.(8)
Can AI Predict the Need for Oxygen Therapy in COVID-19?
A group of researchers from Japan developed an AI system capable of predicting the need for oxygen therapy in the early stages of COVID-19. The study analyzed 194 PCR-confirmed COVID-19 patients. The group created three machine learning models: the first used clinical features (including age, weight, height, past medical history, and laboratory findings), the second used CT images, and the third used both clinical features and CT images.
The model that combined clinical metadata with CT images predicted the need for oxygen therapy with the greatest accuracy (95%), demonstrating the potential of this type of system, especially for triage.(9)
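The multimodal idea behind the best-performing model can be sketched as concatenating clinical metadata with image-derived features into a single vector before classification. The feature values, weights, and scoring function below are placeholders, not the study's actual model.

```python
import numpy as np

def combine_features(clinical, image_features):
    """Concatenate clinical metadata with image-derived features."""
    return np.concatenate([np.asarray(clinical, dtype=float),
                           np.asarray(image_features, dtype=float)])

def needs_oxygen(features, weights, threshold=0.5):
    """Toy linear scorer standing in for the trained model."""
    score = 1.0 / (1.0 + np.exp(-features @ weights))  # logistic score
    return score >= threshold

x = combine_features([67.0, 1.2], [0.8, 0.3, 0.5])   # age-like + CT-derived values
w = np.array([0.01, 0.5, 1.0, -0.4, 0.2])            # placeholder weights
print(needs_oxygen(x, w))
```

Feeding both modalities into one classifier lets the model exploit correlations between clinical status and imaging findings, which is consistent with the combined model performing best in the study.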