Although the term biobanking emerged in the 1990s, it has yet to acquire a clear meaning. For some, the term encompasses the long-term storage services that biorepositories provide; for others, it refers to the administrative work and scientific projects that biobanking supports. Perhaps the word is in flux because biobanking itself is in flux.
Biobanking began as a small-scale pursuit: individual laboratories did it with ordinary freezers and handwritten notes. More recently, biobanking has taken on a big science character. For example, ambitious biobank projects such as the UK Biobank and the All of Us Research Program were launched with the backing of government agencies. (Supporters of the UK Biobank include the UK Department of Health, the Medical Research Council, and the Scottish Executive. The All of Us Research Program is led by the National Institutes of Health.)
Biobanking is a labor-intensive business, and it is increasingly subject to certification. As this article highlights, trends in the field include the development of biomarkers to assess biospecimen quality, the emergence of online marketplaces that help researchers establish connections, and the promulgation of standards for validation and accreditation.
Mark Bagnall, CEO of Phenomix Sciences, notes that individual variability in response to obesity treatment has been recognized for decades and represents a significant therapeutic challenge, yet it remains poorly understood.
Phenomix's technology grew out of research by its physician founders, Andres Acosta, MD, PhD, and Michael Camilleri, MD, of the Mayo Clinic. They examined almost 1,000 patients across multiple clinical trials and found that obesity comprises a wide spectrum of diseases. "At least four of these phenotypes account for 90% of obesity patients," says Bagnall. "We built Phenomix based on that knowledge."
After Acosta and Camilleri founded Phenomix in 2017, the company leveraged the 1,000-patient study to establish the Phenomix Sciences Obesity Platform, a database that contains about 20 billion discrete data points. Using these data, Phenomix scientists are working to identify an individual's obesity subtype, or phenotype, and to guide the selection of treatment.
"This is the foundation for precision medicine for obesity," says Bagnall. "This is the reason for our biobanking study."
Phenomix researchers are now enrolling 2,000 participants who will undergo obesity treatment over the next five years, sampling them at baseline and at regular intervals to establish a longitudinal database. Biological data derived from DNA, hormones, metabolites, and stool samples, together with behavioral information such as exercise and eating habits, are integrated into a specialty app that participants can access and through which they receive individualized feedback.
"This will increase our understanding of the disease," says Bagnall. "We will look at obesity phenotyping in a way that has not been explored before." Beyond its initial research collaboration with the Mayo Clinic, Phenomix is in discussions with other biobanking sites in the United States. The first test developed by Phenomix will identify individuals who respond to glucagon-like peptide-1 receptor agonists, such as semaglutide.
According to Bagnall, a primary care physician might order the test to determine the phenotype of a patient who is overweight or obese. The result would indicate which intervention would be most likely to produce the desired response.
Nearly 60% of Americans are currently overweight or obese. "The associated mortality is astounding, and the obesity epidemic consumes a huge portion of healthcare expenses," says Bagnall. "We will never get healthcare costs under control unless we control obesity."
Chad Borges, a researcher at Arizona State University with joint appointments in the Biodesign Institute and the School of Molecular Sciences, says his interest in biobanking lies in providing quality control and assessment tools. "We need better tools to ensure that samples have been preserved properly, especially if they have been banked for a while and/or changed hands," he explains.
Assays that can indicate how long, in aggregate, biological samples have spent above their storage temperatures are needed because complete freeze-thaw histories are rarely available.
Borges and his colleagues developed the S-Cys-Albumin assay to quantify the cumulative exposure of plasma and serum to thawed conditions. The assay measures the relative abundance of an albumin proteoform called S-cysteinylated albumin. That is, it shows how much of the albumin in a sample has been oxidized via S-cysteinylation, a modification that accumulates while a sample sits above its storage temperature.
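The logic of such an assay, reading cumulative thaw exposure off a calibration curve, can be sketched with a toy model. All numbers below are invented for illustration; they are not the Borges lab's published calibration values, and the real assay's response is not necessarily this simple linear form.

```python
# Toy illustration of using an S-Cys-Albumin-style marker as a "thaw clock."
# All calibration numbers are hypothetical, invented for this sketch.

# Hypothetical calibration: the fraction of albumin that is S-cysteinylated
# rises linearly with cumulative hours the sample has spent thawed.
BASELINE_FRACTION = 0.20   # marker level in a pristine sample (hypothetical)
SLOPE_PER_HOUR = 0.005     # marker increase per hour thawed (hypothetical)

def predicted_marker(hours_thawed: float) -> float:
    """Forward model: marker level after a given cumulative thaw time."""
    return BASELINE_FRACTION + SLOPE_PER_HOUR * hours_thawed

def estimate_hours_thawed(measured_fraction: float) -> float:
    """Invert the calibration to estimate cumulative thaw exposure.

    Values at or below baseline are reported as zero exposure.
    """
    return max(0.0, (measured_fraction - BASELINE_FRACTION) / SLOPE_PER_HOUR)

if __name__ == "__main__":
    # Under this toy calibration, a measured marker level of 0.30 implies
    # roughly 20 hours of cumulative thawed exposure.
    print(round(estimate_hours_thawed(0.30), 1))
```

The point of the sketch is the inversion step: once a marker's accumulation rate is calibrated, a single measurement on an archived sample yields an estimate of its aggregate time above storage temperature, even when no freeze-thaw log exists.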
In blinded challenge studies, Borges and colleagues reported that by using the S-Cys-Albumin biomarker, they could predict with 98% accuracy whether a sample was pristine or had been exposed to thawed conditions. In addition, the instability of biomolecules with known degradation profiles, such as matrix metalloproteinase-9 and α2-macroglobulin, could be linked to changes in S-Cys-Albumin on a sample-by-sample basis.
Stability information is unavailable for many protein analytes of clinical interest. Borges and colleagues therefore used a liquid chromatography-mass spectrometry technique to measure the stability of S-cysteinylated albumin and 21 clinically relevant proteins at room temperature, under refrigeration, and under freezing conditions.
Regardless of the destabilizing temperature or the exposure time, the researchers found a linear inverse relationship between the percentage of proteins destabilized and S-Cys-Albumin. "Many of these instabilities were observed at every single temperature," says Borges.
Even when information about an archived sample's freeze-thaw history is unavailable, the researchers note, such assays can reveal whether the sample remains fit for use. "I would like to see the biobanking field at a point where every biobank has the tools it requires to assess the integrity of its samples," says Borges.
"Instead of developing another commercial biobank, we created a platform to connect two players," says Christopher Ianelli, founder and CEO of iSpecimen. On one side of the platform is a network of providers across the healthcare landscape, such as hospitals and physician practice groups, that have access to patients and clinical samples. On the other side is a network of life sciences researchers.
The iSpecimen Marketplace is a software platform that allows researchers to search for the exact patient samples and datasets that they need, according to Ianelli. In other words, it connects the life sciences researchers who require a specimen with the providers who have it.
iSpecimen maintains that its biospecimen procurement procedures are ethical and that its data exchanges are secure. For example, Ianelli states, all data the company receives is stripped of patient identifiers. In addition, the datasets are curated and cleaned so that everything can be mapped to a standard, uniform data model.
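A pipeline of that general shape, stripping direct identifiers and mapping local codes to a shared vocabulary, can be sketched as follows. The field names and the local-to-standard code mapping here are hypothetical illustrations, not iSpecimen's actual schema or process.

```python
# Hypothetical sketch of a de-identification and normalization step.
# Field names and code mappings are invented; this is not iSpecimen's pipeline.

# Direct identifiers that must never reach the research side.
IDENTIFIER_FIELDS = {"patient_name", "mrn", "ssn", "street_address"}

# Toy mapping from a provider's local diagnosis codes to a standard vocabulary.
LOCAL_TO_STANDARD = {"DX-OB1": "ICD10:E66.9", "DX-DM2": "ICD10:E11.9"}

def normalize_record(record: dict) -> dict:
    """Strip direct identifiers and map local codes to the uniform model."""
    cleaned = {k: v for k, v in record.items() if k not in IDENTIFIER_FIELDS}
    if "local_dx_code" in cleaned:
        # Flag codes the mapping does not cover rather than guessing.
        cleaned["dx_code"] = LOCAL_TO_STANDARD.get(
            cleaned.pop("local_dx_code"), "UNMAPPED"
        )
    return cleaned

if __name__ == "__main__":
    raw = {
        "patient_name": "Jane Doe",
        "mrn": "12345",
        "age": 54,
        "local_dx_code": "DX-OB1",
    }
    print(normalize_record(raw))
```

The design point is that normalization happens once, at ingestion, so that records arriving from hundreds of providers with different local conventions become searchable through a single query interface.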
So far, over 200 healthcare providers have joined the iSpecimen Marketplace, and over 400 customer relationships have been established. "We have completed over 2,000 research projects by matching the appropriate specimens to research programs," says Ianelli.
iSpecimen does not charge hospitals or healthcare providers; its income is derived from fees paid by commercial researchers. This approach serves the life sciences sector, where many subdisciplines face a common bottleneck: gaining access to biospecimens from a very specific group of patients. "We open the bottleneck by giving researchers access to our prebuilt network," says Ianelli.
One challenge that iSpecimen scientists are currently grappling with is the digitization of healthcare. Digitization has been established in finance for years, but it is relatively new for hospitals and healthcare providers. Interoperability, the ability of two or more hospitals to exchange information, has historically been limited, according to Ianelli, but progress has been made in recent years to improve it.
Biobanks must weigh the value of automation against its cost, because automation protects samples during handling and storage, according to Neil Benn, co-founder and managing director of Ziath. The company develops sample tracking and management solutions, with four main product categories: 2D barcoded tubes, devices for handling tubes, 2D barcode scanners, and software.
Although many biobanks have been slow to upgrade their processes, Ziath believes that automation is steadily taking hold. Benn cautions that biobanks that still use Excel to track their samples and stick adhesive labels on their tubes often end up with misidentified, lost, or corrupted samples and incorrect data. "In our experience, the United States currently lags behind Europe and Asia in this regard," he says.
Ziath is regarded as a global leader in the manufacture of 2D barcoded sample tubes for maintaining large sample libraries. Benn says the only convenient, reliable, and efficient way to keep track of high volumes of samples is to use tubes marked with a 2D Data Matrix code.
Although 2D Data Matrix codes provide an excellent solution for tracking, they can be difficult to read. Ziath is applying machine learning and artificial intelligence to help users locate and decode barcodes more quickly, even when the codes are iced over. "We anticipate new digital technologies will enter this field," says Benn.
"I'd like to see additional methods that might be utilized in the processing or storage of materials," says Cory Arant, program manager at the American Association for Laboratory Accreditation (A2LA). He believes there should be a shared format so that organizations less familiar with validation methods would have a tool available to them.
Arant discussed best practices and standards in biobanking, including nonconformities and corrective actions, at the A2LA's 2022 meeting.
In the testing field, many methods are publicly available, and many organizations are adopting robust, standardized approaches. Nevertheless, accredited organizations generally employ their own internal procedures. "Usually, these are validated procedures. Generally, they are probably very similar," says Arant.
Where accreditation and validation are lacking, researchers are more likely to waste valuable time on irreproducible results. As Arant notes, researchers may fail to keep (or discard) information in accordance with widely used standards. He adds that when researchers use the material they receive, they may make preanalytical errors.
Arant is also interested in addressing the need to ship materials across international borders. "If someone attempts to import something, it may sit in customs for an extended period of time," says Arant. "The material may be compromised."