Q&A with Dr Andrew Pollard on standardising graphene

Materials World magazine, 1 May 2016

Dr Andrew Pollard is leading the National Physical Laboratory's efforts to standardise and characterise graphene. He talked to James Perkins.

When will we start seeing graphene standards?

When going through the International Organization for Standardization (ISO) or the International Electrotechnical Commission (IEC), you are usually looking at a minimum of two or three years to develop an international standard. At the moment, there are several graphene standards going through ISO or IEC at different stages, so they will be addressed over the next few years. Some standards being evaluated now will take two or three years to publish. Others have been in development for a year and should hopefully come out in a couple of years' time, so it will be an ongoing process.

One of the first things that we wanted to address was the terminology itself. We found that industry was asking 'what is graphene?' The line between graphene and graphite has been blurred. Since graphene was first isolated, terms such as bi-layer and multi-layer graphene have emerged. One of the key things about standards is that you take into account what is actually going on in the market right now – if people are already using terms, you have to try to standardise them rather than trying to force new terms on an industry that is already talking in a certain way. One of the first standards to come out will hopefully be this terminology document that is going through ISO and IEC.

How does the process work?

The terminology standard is being constantly developed at the moment. Each country has its own standards body, which in the case of the UK is the British Standards Institution (BSI). ISO asks these standards bodies to put forward experts for each standard that is balloted within an ISO committee. Depending on which countries are interested in that standard, you might have quite a few countries each nominating one or more experts, so you end up having a substantial group developing one standard. There is a lot of back and forth and it is an evolving document.

Are defects in graphene always a bad thing?

When you think 'defect' you generally think negatively, but for something like energy storage you might want very defective, smaller flakes, with higher oxygen content. For electronic applications you might want as large a sheet as possible, with as few defects as possible and no oxygen content. So defects are not necessarily negative – the word 'quality' is used a lot in the community, but quality is relative to the application, which makes it all the more important that you can characterise the level of defects.

Can we overcome quality control issues for graphene?

There isn't a standardised technique for large-scale throughput and quality control. These are things that will need to be developed, so it is something we are interested in at the National Physical Laboratory (NPL). You need techniques that are easy to use, or at least in some way easy to automate, because you don't want a technique that requires a professor to analyse the results – you want technician-level understanding. It is definitely a hurdle that needs to be overcome. There is no point in reinventing the wheel – you can apply standards already published in other areas of manufacturing for quality control, so you don't have to test every single flake that comes off a conveyor belt. Standardisation of the terminology is generally required first, before high-level processing and quality control.

You’ve talked about Raman spectroscopy being the 'go-to technique' for graphene analysis, and you are using a tip-enhanced version. How are you using this technique at NPL?

Raman spectroscopy is one of the tools most commonly used for characterising graphene. It is generally non-destructive and is a rapid technique compared with others. But Raman spectroscopy has limitations in terms of lateral resolution. That doesn't necessarily make it unsuitable for quality control. But when you start dealing with graphene flakes that are smaller than the typical probe size of confocal Raman spectroscopy, you can't measure the basal plane of the flakes without also measuring the edges, which register as defects. Tip-enhanced Raman spectroscopy essentially combines scanning probe microscopy with Raman spectroscopy, providing improved lateral resolution. You can achieve 10–20nm lateral resolution, which means you can then look at individual flakes and understand the defect level in those flakes.

Tip-enhanced Raman spectroscopy isn't going to be used as a quality control technique for graphene powders, as it would be too slow. However, it does further our understanding of Raman spectroscopy – you can look at these flakes and how defective they are with nanoscale resolution, and compare that with the measurements you would get from conventional confocal Raman spectroscopy. If we improve our understanding of conventional Raman spectroscopy, it may well be used in future quality control processes.

Can we overcome the characterisation and quality control challenges?

It is going to be a difficult obstacle to overcome, but one of the amazing things about the graphene field is how quickly everything is progressing. There are so many researchers around the world and so much money invested in this area that there have been unbelievable advances in the last few years. This is partly why we currently have a lack of standards. You don't want to try to standardise an area when there isn't enough knowledge, because the only thing worse than no standard is a bad standard. That is why international consensus is also very important.

Is there anything else holding the process back?

A problem is that standardisation essentially relies on the time of volunteers, so a lot of the time it is difficult to get the experts you need involved in the standardisation process. The people you need are working in companies and are very busy trying to make products, or they are academics who are very busy pursuing new and exciting research. Sadly, these people know standardisation is important but they have other priorities. Sometimes that means the standardisation process takes longer than everyone would like.

There is time, though. It is around 10 years since graphene was first isolated, and new technologies can take 20 years or more to be implemented in a real product. For graphene, there are many different application areas predicted and a lot of different ways of making graphene, which complicates the issue.

To find out more about Andrew's work at NPL, visit www.npl.co.uk

Andrew Pollard received his MSci in Physics from the University of Nottingham in 2005 and finished his PhD there in 2010. His previous work includes development of a combined AFM-STM system in UHV, production of graphene monolayers on transition metal surfaces in UHV and subsequent STM surface studies.

Since joining NPL in 2009, he has led the Surface and Nanoanalysis Group's research into the measurement of graphene, other graphene-like advanced 2D materials, and associated devices. He has also engaged with organisations in the emerging graphene industry, advising on aspects of standards and measurement in this area, and is on the advisory board of the Graphene Stakeholders Association (GSA).