Tuesday, December 12, 2017

When chemists research new medicines, much of their work involves testing hundreds of possible variables in a chemical formula in order to find the desired characteristics needed to treat a variety of illnesses.
This process of experimentation and discovery often leads to a development time of more than 10 years before a new drug is brought to market, often at a cost of billions of dollars. Much of that work is done on computers that have to combine and recombine elements to test the results.
As with financial markets, a quantum computer could process all of these variables concurrently, greatly reducing the time and cost necessary to develop new drugs.
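
To make the scale of the problem concrete, here is a minimal classical sketch of an exhaustive screen over hypothetical formulation variables. The variable names and the scoring function are invented for illustration; the point is only that the number of combinations a classical machine must test one at a time grows multiplicatively with every variable added.

    import itertools

    # Hypothetical design variables for a drug candidate (illustrative only).
    substituents = ["H", "OH", "CH3", "Cl", "F"]
    ring_sizes = [5, 6, 7]
    linkers = ["amide", "ester", "ether", "amine"]
    doses = [1, 5, 10, 25, 50]

    def score(candidate):
        # Stand-in for an expensive simulation or assay; in practice each
        # evaluation might take minutes to hours of compute time.
        return hash(candidate) % 1000

    search_space = list(itertools.product(substituents, ring_sizes, linkers, doses))
    print(f"Combinations to test classically, one at a time: {len(search_space)}")

    best = max(search_space, key=score)
    print(f"Best candidate under this toy scoring function: {best}")

With only four variables the space already holds 300 combinations, and each additional variable multiplies that count, which is why classical screening campaigns stretch over years.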

Supply Chain Logistics

Quantum Optimization
The importance of logistics has been well understood throughout history, not only by armies and merchants but also by scientists and mathematicians.
Coordinating and identifying the most efficient route for goods to travel to market has long been one of the most elusive goals for both business and science, and never more so than today, when a single business may manage a global supply chain.
This falls under a class of problems called optimization problems, which generally cannot be solved efficiently by brute-force algorithms that calculate and compare permutations one at a time. Because qubits exist in superposition, however, a quantum computer can apply a given operation to every value represented by that superposition at once.
Rather than requiring billions of trillions of individual operations, quantum computing could reduce the most difficult optimization problems down to a number of operations small enough that the optimal answer can be found quickly.
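
As an illustration of why brute force breaks down, here is a minimal classical sketch of an exhaustive route search over a hypothetical distance matrix. The stops and distances are invented; what matters is that the number of tours grows factorially with the number of stops.

    import itertools

    # Hypothetical symmetric distances between 5 stops (illustrative only).
    cities = ["A", "B", "C", "D", "E"]
    dist = {
        ("A","B"): 12, ("A","C"): 10, ("A","D"): 19, ("A","E"): 8,
        ("B","C"): 3,  ("B","D"): 7,  ("B","E"): 2,
        ("C","D"): 6,  ("C","E"): 20,
        ("D","E"): 4,
    }

    def d(a, b):
        # Look up a distance in either direction.
        return dist.get((a, b)) or dist[(b, a)]

    def route_length(route):
        # Total length of a closed tour starting and ending at route[0].
        return sum(d(route[i], route[(i + 1) % len(route)]) for i in range(len(route)))

    # Brute force: evaluate every permutation of the remaining stops, one at a time.
    best = min((("A",) + p for p in itertools.permutations(cities[1:])), key=route_length)
    print("Best tour:", best, "length:", route_length(best))

Five stops require checking only 24 tours, but 20 stops require more than 10^17, and each added stop multiplies the count again. That combinatorial wall is exactly what quantum optimization approaches aim to sidestep.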

Exponentially Faster Data Analysis

Quantum Data
Source: Pixabay
The explosion of the Internet, rapid advances in computing power, cloud computing, and our ability to store more data than was even considered possible only two decades ago have helped fuel the Big Data revolution of the 21st century. But the rate of data collection is growing faster than our ability to process and analyze it.
In fact, 90 percent of all data produced in human history was produced within the last two years.
As scientific instruments continue to advance and even more data accumulates, researchers' classical computers will be unable to process the growing backlog of data.
Fortunately, scientists at MIT, working with Google, have mathematically demonstrated ways in which quantum computers, paired with supervised machine learning, could achieve exponential increases in the speed of data categorization.
While only theoretical for now, once quantum computers scale sufficiently to handle these data sets, this algorithm alone could process an unprecedented amount of data in record time.
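
For readers unfamiliar with the classical baseline, here is a minimal sketch of supervised data categorization on synthetic data using scikit-learn. This is a classical stand-in, not the MIT/Google quantum algorithm itself, and the dataset and parameters are invented for illustration; the expensive linear-algebra steps inside training are the kind of work the proposed quantum approach would accelerate.

    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Synthetic labeled data standing in for a real scientific data set.
    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # A kernel support vector machine: training cost grows steeply with the
    # number of samples, which is the bottleneck quantum methods target.
    clf = SVC(kernel="rbf").fit(X_train, y_train)
    print("Held-out accuracy:", clf.score(X_test, y_test))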
