The Pentagon wants to know who can make graphene at scale, and it started asking before anyone had published a peer-reviewed method for doing it. DARPA, the Defense Department's research arm, published a formal request for information on April 16 asking industry for data on graphene manufacturing capabilities, cost, and repeatability, according to HigherGov. The RFI specifically flags aerospace load-bearing structures: whether graphene sheets can be made large enough, and whether multiple sheets can be joined without degrading performance, according to GrapheneUses.org. Responses are due June 1.
Eight days later, the journal Small published a peer-reviewed paper from the Birmingham Centre for Mechanochemistry describing a production method the Pentagon is now asking about. The team's technique uses high-intensity vibration at room temperature with no toxic solvents, exfoliating layered two-dimensional materials at ten times the rate of conventional methods, and producing few-layer graphene without introducing the atomic defects that compromise the material's properties. It also works on hexagonal boron nitride (h-BN, an electrical insulator), molybdenum disulfide (MoS₂, a semiconductor), and tungsten disulfide (WS₂, another semiconductor), according to the Birmingham Research Portal, citing the University of Birmingham press release.
DARPA's RFI is market intelligence gathering before a formal solicitation, as GrapheneUses.org reported. The Birmingham paper is evidence that a real production method now exists. Whether this specific method scales from gram-yields to tonne-scale production is the open question. The gap between laboratory demonstration and factory reality is where most graphene ventures die. But the Pentagon is now explicitly asking who can close that gap.
The Birmingham team, led by Dr. Jason Stafford, shakes layered precursor materials in a liquid suspension at accelerations up to 100 times standard gravity until the atomic layers separate cleanly, per the ChemRxiv preprint. The process takes minutes; the first precursor folds appeared within five minutes. Stafford's group has filed patents on mechanochemical processing through University of Birmingham Enterprise; he is listed as co-inventor on twenty patents, according to the Birmingham Research Portal. The team describes the method as opening "more sustainable synthetic routes" and lowering "the barrier for industrial translation." That is accurate. It is also a narrower claim than "solves graphene manufacturing."
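To put "100 times standard gravity" in perspective: for a sinusoidal shaker, peak acceleration is fixed by amplitude and frequency via the standard relation a = A(2πf)². The sketch below computes the displacement amplitude needed to hit 100 g at a few frequencies. The frequencies are illustrative assumptions, not parameters from the Birmingham paper; only the 100 g figure comes from the article.

```python
import math

G = 9.81  # standard gravity, m/s^2

def required_amplitude(accel_g: float, freq_hz: float) -> float:
    """Displacement amplitude (m) a sinusoidal shaker needs to reach
    a given peak acceleration, from a_peak = A * (2*pi*f)^2."""
    omega = 2 * math.pi * freq_hz
    return accel_g * G / omega ** 2

# Illustrative frequencies only; the preprint reports accelerations
# up to ~100 g but these specific settings are assumptions.
for f in (30, 60, 120):
    amp = required_amplitude(100, f)
    print(f"{f:>4} Hz -> amplitude {amp * 1000:.2f} mm")
```

The takeaway is that 100 g is reachable with millimetre-scale strokes at ordinary shaker frequencies, which is part of why the method reads as industrially plausible rather than exotic.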
Graphene, a sheet of carbon atoms in a hexagonal lattice, has been the subject of extraordinary research claims since its isolation in 2004. The material is genuinely remarkable: near-perfect electrical conductivity, mechanical strength hundreds of times that of steel, thermal performance useful across electronics, batteries, and composites. The problem is that making it at scale, consistently, and without introducing defects has been an unsolved engineering challenge for twenty years. Defect-free graphene and defective graphene can perform differently by orders of magnitude, according to a review in the journal Carbon. The difference between a lab curiosity and an industrial material lives in the manufacturing method, and the method is the hard part.
GMG, an Australian company, is building a 10-tonne graphene plant and hired a Rio Tinto veteran to run production, according to GrapheneUses.org. The same company announced April 15 that the energy density of its graphene aluminum-ion battery cells had nearly doubled to 49 watt-hours per kilogram, up from 26 watt-hours per kilogram in December, the site reported. That is a real product, not a roadmap claim. GMG says the cells can fully charge in six minutes. What GMG has not disclosed publicly is which manufacturing method it uses. For the Pentagon's purposes, that supply-chain opacity is exactly the problem the RFI is trying to solve. Whether Birmingham's approach becomes the basis for a domestic supply chain or joins the long list of graphene methods that worked in the lab and nowhere else depends on data that will not exist for another two to three years of engineering work.
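The GMG figures above check out arithmetically, as a quick calculation shows. Both inputs come from the article; the C-rate conversion (C-rate = 1 hour divided by charge time) is standard battery shorthand, not a GMG claim.

```python
# Sanity-check the reported GMG cell figures from the article.
old_density, new_density = 26.0, 49.0  # Wh/kg, December vs. April
ratio = new_density / old_density
print(f"Energy-density gain: {ratio:.2f}x")  # just shy of a true doubling

# A full charge in six minutes corresponds to a 10C charge rate.
charge_minutes = 6
c_rate = 60 / charge_minutes
print(f"Implied charge rate: {c_rate:.0f}C")
```

A 10C charge rate is far above what typical lithium-ion cells tolerate, which is the trade GMG is advertising: lower energy density than lithium-ion, much faster charging.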