Half-random ranty post that might develop into something more structured at some point… Feedback very much welcome.
Andrew Maynard has blogged about the extent to which novelty should (or, in fact, should not) be the main consideration when evaluating nanomaterial risks (initially published as an editorial in Nature Nanotechnology). It’s entitled “Is novelty in nanomaterials overrated when it comes to risks” and is well worth reading in full. A central point is that:
Novelty as a result is a subjective, transient, and consequently a rather unreliable indicator of potential risk. It tends to obscure the reality that conventional behaviour can sometimes lead to harm, and that mundane risks are still risks. And it favours the interesting (and possibly the headline-grabbing) over the important. But if novelty is an unreliable guide to potential risk, how can approaches be developed that help identify, understand and manage plausible risks associated with emerging materials and the products that use them?
Apparently unrelated (but wait for the next paragraphs), there are various initiatives to encourage or even mandate the sharing of data related to the characterisation of (nano)materials. The thinking is that this will boost innovation and bring computational and experimental work closer together. Perhaps the most impressive and concerted effort comes from the White House Office of Science and Technology Policy, as exemplified by the post “It’s Time to Open Materials Science Data”. Publishers have smelled something and are moving into the business of data-sharing and curation services: NPG launched Scientific Data in partnership with FigShare, and Elsevier has just launched an initiative specifically targeted at open data in materials science.
Now for the (arguably subtle and tenuous) link. Novelty is overrated not just when it comes to risk; it is overrated in materials science full stop. This seems counterintuitive: surely scientific endeavour in materials science is about discovering new materials. The problem here (and arguably the opportunity too) is that there is an immense combinatorial space of potential new materials. We work on peptide-capped gold nanoparticles. By varying the peptide sequences and making various mixed monolayers, we can potentially generate hundreds of novel materials every day (and we do make a fair number). The combinatorial space of potential nanomaterials vastly exceeds the number of potential molecules. Most of these materials are not interesting, but they are novel: nobody has made them before.
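To give a rough sense of the scale involved (a back-of-the-envelope sketch of my own, not from Andrew's post): counting only linear peptides built from the 20 standard amino acids, the number of distinct capping sequences grows as 20^n with peptide length n, before you even start combining sequences into mixed monolayers at different ratios.

```python
# Back-of-the-envelope count of the combinatorial space of
# peptide capping ligands (illustrative assumptions only:
# linear sequences, 20 standard amino acids, no modifications).

N_AMINO_ACIDS = 20  # standard proteinogenic amino acids

def peptide_sequences(length: int) -> int:
    """Number of distinct linear peptide sequences of a given length."""
    return N_AMINO_ACIDS ** length

for n in (5, 10, 15):
    print(f"length {n:2d}: {peptide_sequences(n):.2e} sequences")
# length  5: 3.20e+06 sequences
# length 10: 1.02e+13 sequences
# length 15: 3.28e+19 sequences
```

Even a modest 10-residue capping peptide gives on the order of 10^13 candidates, and that is before mixed monolayers, particle sizes, and shapes multiply the space further; no lab, or community, could ever characterise more than a vanishing fraction of it.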
I see a lot of research articles which can be summarised as
- This is a novel nanomaterial (and it truly is: nobody has made this gold-nanorod-with-carbon-dots-at-the-tips-graphene-oxide-on-the-side-and-some-antibody-labelled-conductive-polymer-wrapped-around before)
- It could be used for [delete as appropriate] energy/biological imaging/curing cancer (and it will never be).
When it comes to safety, Andrew argues convincingly that the focus should be on plausible scenarios rather than on novelty. When it comes to what should be curiosity-driven science, a lot of new materials seem to be generated for the sole purpose of highly improbable applications rather than in pursuit of general principles that would help us explore the materials landscape. This has the very unfortunate consequence that the characterisation of these materials is often poor, limited to whatever is thought necessary to enable the envisioned application. An extremely large proportion of these new materials are made by a single group for the purpose of a single paper, and the experiments are never reproduced independently. Capturing all of this data in platforms that are open and suitable for data mining is a noble and worthwhile purpose which I support, but it must be accompanied by a change of focus and higher standards of characterisation; otherwise, I fear it will not help understanding much.
*Novel Nano-Lychees for Theranostics of Cancer*; Charles Spencer and Edna Purviance; Nature Matters-to-all (2015) 7, 101–114