By: Gerald E. Finken
Technology changes healthcare. From the earliest vaccines and antibiotics to artificial intelligence that predicts the antibacterial function of a molecular structure and the development of CRISPR gene editing, there can be no doubt that technology has always played, and will continue to play, a pivotal role in diagnostics, treatment, and many other areas of healthcare.
Yet despite all the areas in which technological breakthroughs have improved healthcare, a confounding issue persists in clinical research. In 1980, according to the Tufts Center for the Study of Drug Development, a new drug took seven to fourteen years to develop, and nine out of ten of those drugs failed to win FDA approval on first-time submission.
Each of those drugs, successful or not, cost an estimated $500 million to develop. Forty years later, it still takes seven to fourteen years to develop a drug, and nine out of ten first-time submissions to the FDA still fail. Now, however, each drug costs an estimated $2.6 billion to develop, successful or not.
Read the entire article on Coruzant Technologies Magazine