We speak to ITProPortal about advances in pharmaceuticals and faster data to pursue a cancer cure.
Many people have been lost to cancer, and anyone can potentially develop one form of the disease or another. The race is therefore on to find a cure. Increasingly, pharmaceutical companies and organisations conducting research to push cancer into history realise that the answer may lie in big data and in sharing it. To achieve this they need reliable network infrastructure that isn't slowed down by data volumes and network latency. Even with the ever-growing scale of big data, it's important to allow data to flow fast to permit accurate analysis.
On 27th April 2017, The Telegraph newspaper published ‘How data is helping to select the best new cancer treatments for patients’. The article, which is sponsored by Cancer Research UK, reveals: “Gifts left in wills help Dr. Bissan Al-Azikani and her team to look at vast amounts of data, which can aid them in identifying the best new cancer treatments for patients.”
Talking about her role, she says: “In the past 10 years, we have made huge strides in discovering therapies as a result of understanding cancer at a molecular level. The challenge is to decide which cancer drug targets show the most promise and which should be prioritised for development. We use artificial intelligence to select the best drug targets by combining DNA data from tens of thousands of patients with billions of results from lab experiments.”
Privacy and transparency
Yet there is one crucial hurdle that the researchers need to overcome: the data is not readily available, and so people need to be encouraged to share their biological data. Even in this field data privacy is important, and people will want to know that their data is being used for a good purpose.
“You must then convince the medical centres and genetic companies who collect this data to offer open access. Hoarding it with their own profitability in mind won’t help anyone to find a cure for cancer. Transparency is crucial, and by sharing it on an open-access basis, economies of scale can be attained and the data sets will number in their millions.” Unfortunately, Dupre points out, the “volume of information is simply not available, but companies ranging from tech behemoths to biomedical start-ups are racing to solve these issues of scale.”
With the right digital infrastructure and informed data-sharing consent in place, anything is possible. Not everyone, but many more patients, may become happy in future to share everything from genome data to blood pressure readings. With increasingly patient-friendly tests it will become possible to check each individual for signs of trouble and to act quickly. However, with exabytes of big data to examine, investment in data acceleration solutions will be a must. WAN optimisation solutions and the man in the van just won’t do.
Yet IT alone won’t cure cancer. It requires a multi-disciplinary approach and a roadmap. The Association of the British Pharmaceutical Industry (ABPI), in association with management consultancy KPMG, has therefore published a ‘Big data roadmap’. The document examines the big data challenges with a call to action, a look at how the UK competes, and an analysis of the future big data opportunities for the pharmaceutical industry.
It concludes by saying: “We need to continue to scan the big data horizon, identifying new opportunities where innovative technologies may play an emerging role. This will ensure focused investment and contribute to a sustainable cycle – identifying and prioritising next generation, high value opportunities.” To create value from big data there needs to be open dialogue, and the report emphasises that collaboration between all of the stakeholders is key – for example, in cancer research and the use of big data to find a cancer cure. But this will amount to nothing unless the right technology is put in place to ensure that data can flow freely and fast across a WAN.