New algorithm could replace chemical toxicity testing in animals

Press Trust of India  |  New York 

Using animals to test the toxicity of chemicals may one day become outdated, thanks to scientists who have developed a low-cost algorithm to test the safety of drugs and other compounds.

However, of the 85,000 compounds used in consumer products, the majority have not been comprehensively tested for safety.

Animal testing, in addition to its ethical concerns, can be too costly and time-consuming to meet this need, according to the study published in the journal Environmental Health Perspectives.

"There is an urgent, worldwide need for an accurate, cost-effective and rapid way to test the toxicity of chemicals, in order to ensure the safety of the people who work with them and of the environments in which they are used," said Daniel Russo, a doctoral candidate at Rutgers University in the US.

"Animal testing alone cannot meet this need," Russo said in a statement.

Previous efforts to solve this problem used computers to compare untested chemicals with structurally similar compounds whose toxicity is already known.

However, those methods were unable to assess structurally unique chemicals, and were confounded by the fact that some structurally similar chemicals have very different levels of toxicity.

The researchers overcame these challenges by developing a first-of-its-kind algorithm that automatically extracts data from PubChem, a database of information on millions of chemicals.

The algorithm compares fragments from tested compounds with those of untested compounds, and uses multiple mathematical methods to evaluate their similarities and differences in order to predict an untested chemical's toxicity.
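The fragment-comparison idea described above can be illustrated with a minimal sketch. This is not the researchers' actual code; the fragment names, toxicity labels, and the simple Jaccard-similarity voting scheme below are all assumptions made for the sake of the example.

```python
# Illustrative "read-across" sketch: each compound is represented by a set
# of structural fragments, and an untested compound's toxicity is predicted
# from the tested compounds whose fragment sets are most similar to its own.

def jaccard(a, b):
    """Similarity between two fragment sets (0 = disjoint, 1 = identical)."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def predict_toxicity(untested, tested, k=3):
    """Majority vote over the k most fragment-similar tested compounds."""
    ranked = sorted(tested,
                    key=lambda c: jaccard(untested, c["fragments"]),
                    reverse=True)
    votes = sum(1 for c in ranked[:k] if c["toxic"])
    return votes > k // 2

# Toy knowledge base of tested compounds (fragment names are made up).
tested = [
    {"fragments": {"benzene", "nitro"},    "toxic": True},
    {"fragments": {"benzene", "hydroxyl"}, "toxic": False},
    {"fragments": {"nitro", "chloro"},     "toxic": True},
    {"fragments": {"hydroxyl", "methyl"},  "toxic": False},
    {"fragments": {"benzene", "chloro"},   "toxic": True},
]

print(predict_toxicity({"benzene", "nitro", "chloro"}, tested))  # True
```

In practice the researchers drew on multiple mathematical methods and millions of PubChem records rather than a single similarity measure, but the underlying step is the same: score how closely an untested compound's fragments match those of compounds with known toxicity.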

The algorithm mines massive amounts of data, and discerns relationships between fragments of compounds from different classes, exponentially faster than a human could, said Lauren Aleksunes of Rutgers.

"This model is efficient and provides companies and regulators with a tool to prioritize chemicals that may need more comprehensive testing in animals before use in commerce," said Aleksunes.

To fine-tune the algorithm, the researchers began with 7,385 compounds for which toxicity data is known, and compared them with data on the same chemicals in PubChem.

They then tested the algorithm with 600 new compounds. For several groups of chemicals, the algorithm had a 62 to 100 per cent success rate in predicting their level of oral toxicity.
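The success rate reported above is simply the fraction of held-out compounds whose predicted oral-toxicity class matches the known class. A small worked example (the prediction and ground-truth values below are invented for illustration, not taken from the study):

```python
# Success rate = correct predictions / total held-out compounds.
predictions = [True, True, False, True, False, False, True, False]
actual      = [True, True, False, False, False, True, True, False]

correct = sum(p == a for p, a in zip(predictions, actual))
success_rate = 100 * correct / len(actual)
print(f"{success_rate:.0f} per cent success rate")  # 6 of 8 correct -> 75
```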

By comparing relationships between sets of chemicals, the researchers also shed light on new factors that can determine the toxicity of a chemical.

Although the algorithm was directed only to assess the chemicals' level of toxicity when consumed orally, the researchers conclude that their strategy can be extended to predict other types of toxicity.

"While the complete replacement of animal testing is still not feasible, this model takes an important step toward meeting the needs of industry, in which new chemicals are constantly under development, and for environmental and ecological safety," said Hao Zhu of Rutgers.

(This story has not been edited by Business Standard staff and is auto-generated from a syndicated feed.)

First Published: Wed, April 17 2019. 17:50 IST