How an AI-Powered Chip Predicts a Drug's Behavior on a Tumor


By Apac CIOOutlook | Friday, December 27, 2019


In the last few years, Artificial Intelligence (AI) has made notable advances across research fields. Deep-learning algorithms excel at quickly finding patterns in reams of data, which has sped up vital processes in scientific discovery. Now, alongside these software improvements, a hardware revolution is also under way.

Recently, Argonne National Laboratory, a national science and engineering research laboratory, announced that it has begun testing a new computer from the startup Cerebras that promises to speed up the training of deep-learning algorithms by orders of magnitude. The computer, which houses the world's largest chip, is part of a new generation of specialized AI hardware that is only now being put to use.

At present, the most widespread chips used in deep learning are graphics processing units, or GPUs. GPUs are parallel processors, and before the AI world adopted them they were used extensively for graphics and game production. By coincidence, the same characteristics that let them render pixels swiftly also make them the favored choice for deep learning. But GPUs remain general-purpose: although they have powered the decade's AI revolution, their designs are not optimized for the task. In response, businesses have raced to design new chip architectures suited specifically to AI. Done well, such chips have the potential to train deep-learning models up to a thousand times faster than GPUs, with far less energy. Cerebras is among the long list of companies that have leaped to take advantage of the opportunity.
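
As a rough illustration of that point, the short sketch below (plain PyTorch, written for this article rather than taken from Argonne or Cerebras) times the large matrix multiplications at the heart of deep learning on a CPU and, if one is present, on a GPU; the gap it reports is the reason parallel processors became the default training hardware.

import time

import torch

def time_matmul(device: str, size: int = 4096, repeats: int = 10) -> float:
    """Average time of a large matrix multiplication on the given device."""
    a = torch.randn(size, size, device=device)
    b = torch.randn(size, size, device=device)
    _ = a @ b                      # warm-up so setup costs are not measured
    if device == "cuda":
        torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(repeats):
        _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()   # wait for queued GPU work to finish
    return (time.perf_counter() - start) / repeats

print(f"CPU: {time_matmul('cpu'):.4f} s per multiply")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.4f} s per multiply")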

A successful new AI chip has to meet quite a few criteria. At a minimum, it has to run the lab's AI models 10 to 100 times faster than general-purpose processors. Most specialized chips are optimized for commercial deep-learning applications such as computer vision and language, and they do not perform as well on the kinds of data common in scientific research. The lab works with many higher-dimensional data sets, ones that weave together enormous, disparate data sources and are far more intricate to process than a two-dimensional image. The chip must also be easy to use and reliable, because thousands of people do deep learning at the lab and not everyone is an expert programmer. The chip is expected to let them get work done quickly without having to spend time learning something new on the coding side.
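
To make that point about data concrete, here is a minimal, hypothetical sketch of the kind of model such research calls for: one input is a 3-D volume rather than a flat image, another is a vector of tabular measurements, and the network has to fuse the two. The shapes and layer sizes are placeholders, not the lab's actual models.

import torch
import torch.nn as nn

class FusedScienceModel(nn.Module):
    """Toy network that fuses a 3-D volume with tabular measurements."""

    def __init__(self, tabular_dim: int = 64, num_classes: int = 2):
        super().__init__()
        # Branch for a volumetric input (channels, depth, height, width).
        self.volume_branch = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
            nn.Flatten(),                      # -> (batch, 8)
        )
        # Branch for tabular features such as assay or sensor readouts.
        self.tabular_branch = nn.Sequential(
            nn.Linear(tabular_dim, 32),
            nn.ReLU(),
        )
        self.head = nn.Linear(8 + 32, num_classes)

    def forward(self, volume, tabular):
        fused = torch.cat([self.volume_branch(volume), self.tabular_branch(tabular)], dim=1)
        return self.head(fused)

model = FusedScienceModel()
volume = torch.randn(4, 1, 16, 32, 32)   # batch of four 3-D volumes
tabular = torch.randn(4, 64)             # matching tabular measurements
print(model(volume, tabular).shape)      # torch.Size([4, 2])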

So far, Cerebras's computer has checked all the boxes. Thanks to its sheer size (the chip is bigger than an iPad and holds over a trillion transistors for performing calculations), it does not require hooking multiple smaller processors together, a setup that can slow down model training. In testing, the chip has already reduced the training time of models from weeks to hours. The lab wants to be able to train models fast enough that the scientist doing the training still remembers what the question was when they started.
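
For contrast, the conventional workaround the giant chip sidesteps looks roughly like the sketch below: each batch is scattered across several smaller processors and the results gathered back every step, which adds synchronization overhead. This uses PyTorch's nn.DataParallel purely as an illustration; it is not how Argonne's systems are configured.

import torch
import torch.nn as nn

# A toy model standing in for a real workload.
model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 10))

if torch.cuda.device_count() > 1:
    # Every forward pass scatters the batch across the GPUs and gathers the
    # outputs back; that per-step communication is the overhead a single
    # very large chip avoids.
    model = nn.DataParallel(model).cuda()
    inputs = torch.randn(256, 1024).cuda()
else:
    inputs = torch.randn(256, 1024)      # fall back to a single device

outputs = model(inputs)
print(outputs.shape)                     # torch.Size([256, 10])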

The laboratory has been testing the computer on its cancer drug study, whose objective is to develop a deep-learning model that can predict how a tumor may respond to a drug or combination of drugs. The model can then be employed in one of two ways: to develop new drug candidates that have the desired effect on a particular tumor, or to predict the effect of a single drug candidate on several different types of tumors. The lab expects Cerebras's system to considerably speed up both development and deployment of the cancer drug model, which could involve training the model hundreds of times and then running it billions of times more to make predictions on each drug candidate.
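
A stripped-down, hypothetical version of such a model might look like the sketch below: it takes a tumor profile and a drug descriptor and outputs a single response score, and the two screening modes described above correspond to holding one input fixed while sweeping the other. The dimensions and features are invented for illustration, not taken from the lab's model.

import torch
import torch.nn as nn

class DrugResponseModel(nn.Module):
    """Toy predictor of how strongly a tumor responds to a drug."""

    def __init__(self, tumor_dim: int = 512, drug_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(tumor_dim + drug_dim, 256),
            nn.ReLU(),
            nn.Linear(256, 1),               # predicted response score
        )

    def forward(self, tumor, drug):
        return self.net(torch.cat([tumor, drug], dim=1))

model = DrugResponseModel()

# Mode 1: screen many candidate drugs against one tumor profile.
one_tumor = torch.randn(1, 512).expand(1000, -1)
candidate_drugs = torch.randn(1000, 256)
scores_per_drug = model(one_tumor, candidate_drugs)

# Mode 2: predict one drug's effect across many tumor types.
many_tumors = torch.randn(1000, 512)
one_drug = torch.randn(1, 256).expand(1000, -1)
scores_per_tumor = model(many_tumors, one_drug)

print(scores_per_drug.shape, scores_per_tumor.shape)   # (1000, 1) (1000, 1)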

Additionally, the scientists hope the system will boost the lab's research on other topics, such as traumatic brain injury and battery materials. The former would entail developing a model to predict the best treatment options, a difficult task because it requires quickly processing many types of data, including biomarkers, brain images, and text. The latter would involve designing an AI model to predict the properties of millions of molecular arrangements in search of substitutes for lithium-ion chemistry. Ultimately, the team is excited by the potential that the combination of AI hardware and software advances will bring to scientific exploration.


