IEEE University of Lahore


For Specialized Optimizing Machines, It’s All in the Connections

Friday, May 24th, 2019

Whether it’s an Ising machine or a quantum annealer, the machine seems to matter less than how the parts connect

Suppose you want to build a special machine, one that can plot a traveling salesperson’s route or schedule all the flights at an international airport. That is, the sorts of problems that are incredibly complex, with a staggering number of variables. For now, the best way to crunch the numbers for these optimization problems remains a powerful computer.

But research into developing analog optimizers—machines that manipulate physical components to determine the optimized solution—is providing insight into what is required to make them competitive with traditional computers.

To that end, a paper published today in Science Advances provides the first experimental evidence that high connectivity, or the ability of each physical component to interact directly with the others, is vital for these novel optimization machines. “Connectivity is very, very important; it’s not something one should ignore,” says Peter McMahon, a postdoctoral researcher at Stanford, who participated in the research.

I Built a Brain Computer Interface for Tackling ADHD in Children

Friday, May 24th, 2019

A playful headpiece called Agent Unicorn incorporates an EEG that detects P300, the neural signature of attention

I am a fashion-tech designer: I combine fashion with engineering, science, and interactive user-experience technologies. When worn, many of my designs monitor physiological indicators—such as heart rate—and react to this information in some way, communicating the wearer’s internal state. In 2016, I was in Linz, Austria, as an Ars Electronica Futurelab artist-in-residence, and I decided to take the opportunity to see if I could apply my techniques therapeutically.

The result was a headpiece that helps children with attention-deficit hyperactivity disorder (ADHD) and their caregivers better understand what environmental cues are associated with symptomatic problems. In April, the underlying technology became commercially available for makers interested in brain-computer interfaces (BCIs).

When I arrived in Linz, I was already interested in electroencephalography (EEG) devices, which measure the brain’s electrical activity, as I saw them as an extension of my earlier work. Through Ars Electronica I met Dominik Laister at the nearby Barmherzige Brüder Hospital, who became a valuable advisor. After consulting with him, I decided to focus on what is known as the P300 event-related brain potential signal. P300 is a frequent focus of clinical exams and BCI research. It is a voltage pulse, often thought to be connected to attention and decision making, that occurs a few hundred milliseconds after an external stimulus. The P300 signal is often measured when diagnosing ADHD because in children with the disorder it takes longer to manifest and isn’t as strong as in children without it.
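Because the P300 is small relative to ongoing brain activity, it is typically extracted by averaging many EEG epochs time-locked to the stimulus: random background activity cancels out, while the stimulus-locked deflection around 300 milliseconds survives. The sketch below illustrates that idea on synthetic data; the function name, sampling rate, and signal parameters are illustrative, not drawn from the Agent Unicorn firmware.

```python
import numpy as np

def average_p300_epochs(eeg, stim_samples, fs, pre=0.1, post=0.6):
    """Average EEG epochs around stimulus onsets to reveal the P300.

    eeg:          1-D array of samples from one electrode
    stim_samples: sample indices of stimulus onsets
    fs:           sampling rate in Hz
    Returns (times, average), with times in seconds relative to the stimulus.
    """
    n_pre, n_post = int(pre * fs), int(post * fs)
    epochs = []
    for s in stim_samples:
        if s - n_pre >= 0 and s + n_post <= len(eeg):
            epoch = eeg[s - n_pre:s + n_post]
            epochs.append(epoch - epoch[:n_pre].mean())  # baseline-correct
    times = np.arange(-n_pre, n_post) / fs
    return times, np.mean(epochs, axis=0)

# Synthetic demo: a small positive bump ~300 ms after each stimulus,
# buried in noise much larger than the bump itself.
fs = 250
rng = np.random.default_rng(0)
eeg = rng.normal(0, 5, fs * 60)                 # 60 s of noisy "EEG"
stims = np.arange(fs, len(eeg) - fs, fs)        # one stimulus per second
t = np.arange(int(0.6 * fs)) / fs
bump = 3 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # peak at 300 ms
for s in stims:
    eeg[s:s + len(bump)] += bump

times, avg = average_p300_epochs(eeg, stims, fs)
peak_time = times[np.argmax(avg)]
print(f"averaged peak at {peak_time * 1000:.0f} ms")
```

A single epoch here is dominated by noise; only the average across dozens of stimuli makes the peak near 300 ms stand out, which is why P300 exams present the subject with many repeated stimuli.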

ADHD is commonly treated with stimulants such as Adderall that can boost concentration and focus while reducing hyperactive and impulsive behaviors. But while medication might help soothe symptoms, it doesn’t help in understanding why a child with ADHD is more prone to react to certain stimuli, or how their symptoms might be treated in a way that reduces reliance on drugs.

My goal was to create a device that would provide the data needed for such insights by monitoring both the brains of children and their environments. The result was Agent Unicorn—a headpiece with a projecting horn. The horn contains an 8-megapixel camera that records video during states of heightened P300 activity, as detected by an EEG built into the headpiece itself. The headpiece has a shape that automatically positions the electrodes at the correct locations on the skull.

To create Agent Unicorn, I did not want to use any of the commercial EEG devices then available. The cheaper devices lacked good data-acquisition and processing capabilities—many of them used a single electrode that touched the forehead and struggled to distinguish between electrical activity caused by brain waves and activity caused by muscle contractions. Medical-grade systems were steeply priced—up to US $10,000 for a clunky, albeit reliable, device.

So I began collaborating with Christoph Guger, the founder and co-CEO of G.tec, a medical engineering company located near Linz. G.tec is a large producer of EEG equipment and software, mainly supplying hospitals and other medical clients. But Guger has a keen interest in the new generation of makers who want to use neurological data in their projects. Because I was specifically interested in the P300 signal, we were able to trim the number of electrodes down from the conventional 64 to eight. We developed a miniature EEG board that connects to electrodes that can be used without applying conductive gel.

In the headpiece, the EEG board is connected to a Raspberry Pi Zero W single-board computer. The Pi also receives a continual video feed from the horn camera. When a P300 event is detected, the Pi wirelessly sends a video clip incorporating the 3 seconds before the event and the 5 seconds after to a laptop computer, so that what captured the wearer’s attention can be reviewed.
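Capturing video from before an event requires keeping a rolling buffer of recent frames at all times, then splicing it together with the frames that follow the trigger. The sketch below is a minimal illustration of that pattern; the class and parameter names are hypothetical and not taken from the actual device software.

```python
from collections import deque

class EventClipRecorder:
    """Keep a rolling buffer of recent frames so that when an event fires,
    a clip spanning the seconds before and after can be assembled."""

    def __init__(self, fps=30, pre_s=3, post_s=5):
        self.pre = deque(maxlen=fps * pre_s)  # last pre_s seconds of frames
        self.post_needed = fps * post_s       # frames to collect after trigger
        self.pending = None                   # clip under construction
        self.start_len = 0

    def trigger(self):
        """Call when a P300 event is detected: snapshot the pre-event frames."""
        self.pending = list(self.pre)
        self.start_len = len(self.pending)

    def add_frame(self, frame):
        """Feed every captured frame; returns a finished clip, or None."""
        finished = None
        if self.pending is not None:
            self.pending.append(frame)
            if len(self.pending) >= self.start_len + self.post_needed:
                finished, self.pending = self.pending, None
        self.pre.append(frame)
        return finished

# Demo with integer "frames": trigger at frame 100.
rec = EventClipRecorder(fps=30, pre_s=3, post_s=5)
clip = None
for i in range(400):
    if i == 100:
        rec.trigger()
    out = rec.add_frame(i)
    if out:
        clip = out
# clip spans frames 10..249: 3 s before and 5 s after the event at 30 fps
print(len(clip), clip[0], clip[-1])
```

On the real headpiece the finished clip would be encoded and sent over Wi-Fi to the laptop; the buffer-and-splice logic is the same regardless of transport.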

By July 2016 I had a few headpieces with horns in a variety of shapes. In collaboration with 3D-printing specialist Igor Knezevic in Los Angeles, we researched an even wider range of horn shapes and colors, and then began trying Agent Unicorn out with real children. We soon discovered that the more playful and weird a horn was, the more eager the children were to try it on.

One of the advantages of Agent Unicorn is the chance to gather data when a child is in a mental state closer to that of their day-to-day life, due to the more natural environments that the headpiece can be used in, and the agency it offers children. Instead of a doctor measuring their brain activity in a clinic, while they’re chained to a machine via 64 head-mounted electrodes, children can pick up and place the device on their heads themselves while in a playroom or classroom.

The Agent Unicorn headpieces also have LEDs that flash during P300 events. This can make a therapist’s job easier by highlighting moments when a child becomes especially attentive, and this can suggest points of departure for nonpharmaceutical therapeutic measures.

While clinical studies are in progress to confirm the value of using Agent Unicorn to treat children with ADHD, G.tec has recently released a US $1,100 version with the Unicorn EEG for scientists, makers, and artists. This version comes without the camera horn, and uses a soft skullcap to hold and position the electrodes. A Bluetooth connection transmits data for real-time analysis using a number of different programming languages.

My aim is to create a learning system that brings more self-awareness to wearers and helps them identify subconscious brain activity. Anyone can wear this to measure their activity and to draw conclusions from it. For example, how do we react to various colors? Does seeing a piece of chocolate produce a spike? Or how do we respond to hearing our own names? If you want to try the device yourself without buying it, join us at one of the BR41N.io hackathons for BCI designers at venues around the world!

This article appears in the June 2019 print issue as “A Brain Interface to Capture Your Attention.”

June 1878: Muybridge Photographs a Galloping Horse

Thursday, May 23rd, 2019

Exposure times fell from a thousandth of a second in 1878 to a millionth of a billionth of a second in the ’90s. Today, that’s considered slow

Eadweard Muybridge (1830–1904), an English photographer, established his American fame in 1867 by taking a mobile studio to Yosemite Valley and producing large silver prints of its stunning vistas. Five years later he was hired by Leland Stanford, then the president of the Central Pacific Railroad, formerly the governor of California and latterly the founder of the eponymous university in Palo Alto. Stanford—who was also a horse breeder—challenged Muybridge to settle the old dispute about whether all four of a horse’s legs are off the ground at one time during a gallop.

Muybridge found it difficult to prove the point. In 1872 he took (and then lost) a single image of a trotting horse with all hooves aloft. But he persevered, and his eventual solution was to capture moving objects with cameras capable of a shutter speed as brief as 1/1,000 of a second.

The conclusive experiment took place 141 years ago, on 19 June 1878, at Stanford’s Palo Alto farm. Muybridge lined up thread-triggered glass-plate cameras along the track, used a white-sheet background for the best contrast, and copied the resulting images as simple silhouettes on a disc rotating in a zoopraxiscope, a device he invented in order to display a rapid series of stills to convey motion. Sallie Gardner, the horse Stanford had provided for the test, clearly had all four hooves off the ground. But the airborne moment did not take place as portrayed in famous paintings, perhaps most notably Théodore Géricault’s 1821 Derby at Epsom, now hanging in the Louvre, which shows the animal’s legs extended, away from its body. Instead, it occurred when the horse’s legs were beneath its body, just prior to the moment the horse pushed off with its hind legs.

This work led to Muybridge’s magnum opus, which he prepared for the University of Pennsylvania. Starting in 1883, he began to make an extensive series depicting animal and human locomotion. Its creation relied on 24 cameras fixed in parallel to the 36-meter-long track and two portable sets of 12 batteries at each end. The track had a marked background, and animals or people activated the shutters by breaking stretched strings.

The final product was a book with 781 plates, published in 1887. This compendium showed not only running domestic animals (dogs and cats, cows and pigs) but also a bison, a deer, an elephant, and a tiger, as well as a running ostrich and a flying parrot. Human sequences depicted various runs and also ascents, descents, lifts, throws, wrestling, dances, a child crawling, and a woman pouring a bucket of water over another woman.

Muybridge’s 1,000 frames a second soon became 10,000. By 1940, the patented design of a rotating mirror camera raised the rate to 1 million per second. In 1999, Ahmed Zewail received the Nobel Prize in Chemistry for developing a spectrograph that could capture the transition states of chemical reactions on a scale of femtoseconds—that is, 10⁻¹⁵ second, or one-millionth of one-billionth of a second.

Today, we can use intense, ultrafast laser pulses to capture events separated by mere attoseconds, or 10⁻¹⁸ second. This time resolution makes it possible to see what has been until recently hidden from any direct experimental access: the motions of electrons on the atomic scale.

Many examples can be given to illustrate the extraordinary scientific and engineering progress we have made during the past 141 years, but this contrast is as impressive as any other advance I can think of—from settling the dispute about airborne horse hooves to observing flitting electrons.

This article appears in the June 2019 print issue as “June 1878: Muybridge’s Galloping Horse.”

The Last Working Olivetti Mainframe Sits In a Tuscan High School

Thursday, May 23rd, 2019

How an encounter with the ELEA 9003 inspired a tech historian’s career

About 10 years ago, Elisabetta Mori and some friends were doing research for an art exhibit on the theme of “archives of memories.”

“We approached the theme literally, and so we looked for old examples of physical memories—computer memories,” Mori recalls. “We tried to see the oldest computers built in Italy.” At the Museum of Computing Machinery in Pisa, they saw the Calcolatrice Elettronica Pisana, an early digital computer built by the University of Pisa in 1957 with the support of the Olivetti company. But the machine had long ago stopped working.

Then they heard about a working model of the ELEA 9003, Olivetti’s first commercial mainframe, introduced in 1959. They lost no time tracking it down.

This 9003 had originally belonged to a bank in Siena, where it was used for payroll, managing accounts, calculating interest rates, and the like. In 1972, the bank donated the computer to a high school in the Tuscan hill town of Bibbiena. And there it’s been ever since. Today, former Olivetti employees periodically travel to the ISIS High School Enrico Fermi to tend to the machine.

The mainframe’s sleek aluminum modular racks and peripherals occupy a large room, with Olivetti typewriters and calculators spread around the space. The technicians keep spare parts on hand, as well as original manuals and blueprints.

The encounter with the computer changed Mori’s life. She wrote a master’s thesis about it. Now, she is a Ph.D. candidate in the history of computing at Middlesex University in London. Mori’s article, “The Italian Computer: Olivetti’s ELEA 9003 Was a Study in Elegant, Ergonomic Design,” describes the company’s heroic effort to launch the ELEA 9003. [In the photo at top, Mori is seated at the 9003’s console.]

“The machine works, but it is fragile,” Mori says. The computer contains more than 40 kilometers of copper cable wrapped in woven glass fiber. “If you don’t run the computer regularly, it will stop working. If you move it, it will die.”

To forestall that eventuality, a local group called the Associazione Amici dell’Olivetti ELEA 9003 is raising funds to hire and train workers to maintain the computer. You can reach them at amicielea9003@gmail.com.

“Until I saw it working, I didn’t realize how complex, fascinating, and noisy these early computers were,” Mori says. “I would have missed one big part of the story.”

New Optimization Chip Tackles Machine Learning, 5G Routing

Wednesday, May 22nd, 2019

A 49-core chip by Georgia Tech uses a 1980s-era algorithm to solve some of today’s toughest optimization problems faster than a GPU

Engineers at Georgia Tech say they’ve come up with a programmable prototype chip that efficiently solves a huge class of optimization problems, including those needed for neural network training, 5G network routing, and MRI image reconstruction. The chip’s architecture embodies a particular algorithm that breaks up one huge problem into many small problems, works on the subproblems, and shares the results. It does this over and over until it comes up with the best answer. Compared to a GPU running the algorithm, the prototype chip—called OPTIMO—is 4.77 times as power efficient and 4.18 times as fast.

The training of machine learning systems and a wide variety of other data-intensive work can be cast as a class of mathematical problems called constrained optimization. In these problems, you’re trying to minimize the value of a function under some constraints, explains Georgia Tech professor Arijit Raychowdhury. For example, training a neural net could involve seeking the lowest error rate under a constraint on the size of the neural network.

“If you can accelerate [constrained optimization] using smart architecture and energy-efficient design, you will be able to accelerate a large class of signal processing and machine learning problems,” says Raychowdhury. A 1980s-era algorithm called alternating direction method of multipliers, or ADMM, turned out to be the solution. The algorithm solves enormous optimization problems by breaking them up and then reaching a solution over several iterations.
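ADMM’s split-solve-share structure is easiest to see on a small example. The sketch below applies textbook ADMM to the lasso problem (minimize ½‖Ax − b‖² + λ‖x‖₁), alternating between a quadratic x-update, an elementwise soft-thresholding z-update, and a dual update that reconciles the two copies of the variable. This is the generic algorithm, not OPTIMO’s hardware mapping, and all parameter values are illustrative.

```python
import numpy as np

def soft_threshold(v, k):
    """Elementwise shrinkage operator, the closed-form l1 subproblem."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def lasso_admm(A, b, lam=0.1, rho=1.0, iters=200):
    """Solve min 1/2||Ax - b||^2 + lam*||x||_1 via ADMM splitting x = z."""
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    # Cache the Cholesky factor used by every x-update
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        # x-update: small quadratic subproblem (a linear solve)
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: soft-thresholding enforces sparsity
        z = soft_threshold(x + u, lam / rho)
        # dual update: push the two copies of the variable to agree
        u = u + x - z
    return z

# Recover a sparse vector from noisy linear measurements
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 20))
x_true = np.zeros(20); x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.normal(size=50)
x_hat = lasso_admm(A, b, lam=0.5)
print(np.round(x_hat[:5], 2))
```

Each iteration only ever solves small, cheap subproblems, which is what makes the method attractive to map onto many simple cores that exchange intermediate results.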

ABB & Siemens Test Subsea Power Grids for Underwater Factories

Wednesday, May 22nd, 2019

Putting a power-distribution station on the ocean floor could allow more raw materials to be processed down there

Slowly but surely, oil- and gas-drilling technology is migrating from floating platforms to the seafloor. Pumps moved down there decades ago. More recently, compressors (which boost pressure in a well to keep gas flowing) and separators (which isolate oil from water and silt) have relocated to the murky depths.

Putting this equipment closer to wells makes the wells more productive and energy efficient. Some oil and gas companies even aspire to build subsea factories that extract and process oil and natural gas directly on the seafloor. These factories would be safe from hazards such as icebergs and hurricanes. They would be controlled remotely, reducing labor costs. Eventually, some believe, offshore platforms could be phased out entirely.

However, all of this sunken gear requires electricity. Today, operators typically string power lines from power plants or diesel generators aboard nearby oil rigs to every piece of subsea equipment they install. That works for a few machines, but it’s impractical to string dozens of umbilicals, as they’re known, to the ocean floor.

Industry suppliers ABB and Siemens are now putting the finishing touches on competing versions of the world’s first subsea power-distribution stations. Once installed, these stations would connect via a single line to a “topside” (maritime parlance for above water) generator, wind turbine, or power plant, and redistribute electricity to underwater equipment. “Our technology is an enabling technology for the subsea factory,” says Bjørn Rasch, head of subsea power for Siemens.

Both projects have been in the works for more than five years. ABB will complete its final round of testing in June and expects to install its first subsea power system in 2020. Siemens tested its version in shallow water in Norway last November and is now talking with clients about putting its first unit in the field. “We’re getting close to where we’re actually deploying this technology in a real project,” Rasch says.

Siemens’s model, which the company calls its Subsea Power Grid, consists of a transformer, a medium-voltage switchgear, and a variable-speed drive. Its distribution voltage is around 30 kilovolts, while its variable-speed drive puts out 6.6 kV. The system can provide electricity to devices with power ratings between 1 and 15 megawatts. The umbilical that hooks it to a generation station also includes an embedded fiber-optic cable so operators can run everything from afar.

One of the hardest parts of building the station, Rasch says, was ensuring it could withstand the high water pressure of the seafloor. Instead of encasing all the equipment in a pressurized chamber, engineers flooded the electronics with a synthetic fluid called Midel. This biodegradable fluid inside the equipment maintains the same pressure as the seawater, which alleviates stress. The fluid also passively cools the device by transferring heat from equipment to the chilly seawater.

Chevron, Eni Norge, Equinor, and ExxonMobil have all worked with Siemens to get the company’s project this far. The next step for both ABB and Siemens will be to deliver the first model for installation at an active production site.

Brian Skeels, professor of subsea engineering at the University of Houston and director of emerging technology for the offshore design and consulting firm TechnipFMC, has seen many attempts to “marinize” technologies to work underwater. Dealing with heat is a common stumbling block. If water can’t flow freely around a device, the heat it generates prompts marine life to grow on the equipment, which shortens its life-span. And, Skeels cautions, “what may work in shallow water may not work at deeper depths.”

Both systems are expected to work at depths of up to 3,000 meters and operate for 30 years with minimal maintenance. At the end of their lives, the units can be removed from the seafloor.

A power-distribution center would be just one piece of any future subsea factory—a vision that has captivated the industry for more than a decade. Skeels says the future of subsea processing will depend largely on whether such projects can add more value to the industry than they drain in expense. Investment into subsea processing dried up when oil prices crashed in 2014. Looking ahead, Skeels thinks the technology holds the most potential for remote wells more than 160 kilometers from other facilities.

Hani Elshahawi, digitalization lead for deepwater technologies at Shell, says there are clear benefits to having power readily available on the seafloor. But he doesn’t think subsea factories will supplant all platform activities, or replace any of them in the near future. “It will require decades, in my view,” he says. “We foresee a more gradual and lengthy transition.”

To Rasch at Siemens, though, the industry’s vision of subsea factories does not seem as far out as it once did. “There are many technologies in many companies that are in place or close to being in place,” he says. “This can be realized in the close future, that’s for sure.”

This article appears in the June 2019 print issue as “ABB and Siemens Test Subsea Power Grids.”

This article was updated on 22 May 2019.

Get tips to develop your DAQ test systems

Tuesday, May 21st, 2019

Reduce your test development time, increase throughput and improve the accuracy of your test systems

There is a growing trend across all industries to design feature-rich products. You need to thoroughly test your product while meeting market windows and project deadlines. Learn how a data acquisition system could help you achieve all of these goals in the ebook “Four Things to Consider When Using a DAQ as a Data Logger.”


Do it in Software or Hardware? VR Audio Takes on the Perennial Question

Tuesday, May 21st, 2019

Dirac creates immersive audio for VR with algorithms while Harman patents flying robotic speakers

Hardware and software engineers, as the perennial lightbulb jokes point out, see the world differently. And indeed, many problems can be solved either way; the better choice is not always obvious, at least at the time. Reports now indicate that the design of the 737 Max involved a series of choices between solving problems in hardware or in software.

Product designers are constantly tackling less critical problems that have both hardware and software solutions. One of the more recent examples: How do you create realistic, fully immersive audio for virtual reality that adjusts seamlessly as the user moves about the virtual world? Garden-variety surround sound seems immersive, but it generally doesn’t change as you move in relation to the speakers in a room, so the virtual experience can easily break apart.

Dirac, a company that develops digital signal processing techniques for optimizing audio, thinks the solution is software-centric. The company’s researchers measured what are called head-related transfer functions (HRTFs) for 30 people. HRTFs encode the changes to a sound that occur due to reflections and shadows caused by the shape of the head, ears, and torso. The researchers collected hundreds of HRTF measurements, instructing the subjects to move their heads around in relation to the rest of their bodies, and used these measurements to calculate a generic, dynamic HRTF model. Virtual reality products can use this model to make the sounds of a virtual environment respond to a user’s movements and appear to come from the virtual audio source, even when the actual environment contains just two speakers or headphones.
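In software, an HRTF is applied by convolving the source signal with a pair of head-related impulse responses (HRIRs), one per ear. The toy sketch below fakes a source off to the listener’s right using nothing but an interaural time and level difference; real HRIRs are measured, far richer in spectral detail, and are swapped or interpolated as the head moves. All names and values here are illustrative, not Dirac’s implementation.

```python
import numpy as np

def render_binaural(mono, hrir_left, hrir_right):
    """Spatialize a mono signal by convolving it with a pair of
    head-related impulse responses, one for each ear."""
    return np.convolve(mono, hrir_left), np.convolve(mono, hrir_right)

# Toy HRIRs for a source on the listener's right at 44.1 kHz:
# the left ear hears the sound ~0.6 ms later (time difference)
# and at half the amplitude (level difference).
fs = 44100
itd_samples = int(0.0006 * fs)                 # ~26 samples of delay
hrir_right = np.zeros(64); hrir_right[0] = 1.0
hrir_left = np.zeros(64); hrir_left[itd_samples] = 0.5

t = np.arange(fs // 10) / fs
mono = np.sin(2 * np.pi * 440 * t)             # 100 ms, 440 Hz tone
left, right = render_binaural(mono, hrir_left, hrir_right)
# In a dynamic renderer, the HRIR pair would be re-selected every few
# milliseconds as the head tracker reports a new orientation.
```

Even these two crude cues are enough for the brain to lateralize the source; a full HRTF adds the elevation and front-back cues created by the outer ear.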

Meanwhile, audio equipment-maker Harman International (now part of Samsung) has sketched out a solution that requires lots of hardware. In a patent application for a mobile speaker system for virtual reality environments, Harman describes flying robotic speakers that reposition themselves as a user navigates a virtual world. The system includes a processor that maps the virtual environment to the user’s physical environment, “transmitting information to the robotic speaker to cause the robotic speaker to move to a location in a physical environment that corresponds to a location of the virtual object in the virtual environment, and transmitting an audio signal to the robotic speaker. The robotic speaker outputs, based on the audio signal and at the location in the physical environment, the sound associated with the virtual object.”

The Harman engineers have a number of ideas for how these robotic speakers would move around, including wheels, rotors, pulleys, and gas propulsion. If nothing else, such a system would certainly make watching someone navigate a virtual experience more entertaining.

X-ray Detection May Be Perovskites’ Killer App

Monday, May 20th, 2019

The wonder crystal could yield imagers that are far more sensitive than commercial detectors

The crystalline material known as perovskite makes for a superefficient photovoltaic cell. Researchers are also exploring perovskites’ potential in transistors and LED lighting. But there’s yet another use for this wonder crystal, and it may be the most promising of all: as X-ray detectors.

Dozens of groups around the world are exploring this area, and major X-ray imaging manufacturers, including Samsung and Siemens, are considering perovskite for their next-generation machines. Compared with today’s X-ray imagers, detectors based on perovskite compounds are far more sensitive and use less power. And for certain applications, the materials can be tuned to emit color when irradiated. Lab prototypes of imagers that use perovskite have been demonstrated to be at least 100 times as efficient as their conventional counterparts.

“Interest in perovskite crystals for imaging emerged out of all the recent enthusiasm to get better solar panels,” says I. George Zubal, director of the nuclear medicine and computed tomography programs at the National Institute of Biomedical Imaging and Bioengineering (NIBIB), in Bethesda, Md. His program funds research into new imaging devices, procedures, and software, including groups looking at perovskite X-ray detection.

What makes perovskites so useful for X-ray detection is the same thing that makes them good for solar cells: They’re excellent at converting light into electrical charge. In a direct detector, X-ray photons are converted into electrons inside a semiconductor. In a scintillator imager, the X-ray photons are first converted into visible light, which is then converted into electrons by a photodiode array.

Conventional direct X-ray detectors have higher resolution than do scintillators, but they take longer to acquire an image. That’s because the semiconductor material they typically use—amorphous selenium—isn’t great at stopping X-rays. Scintillator imagers, on the other hand, are more sensitive than direct X-ray imagers—meaning you need fewer X-rays to create the image—but yield a lower-quality image.

Perovskites could be the answer to the main shortcomings of current X-ray imagers, says Zubal. “Perovskite stops a lot more of the X-rays [compared to amorphous selenium], and being a semiconductor it should give us higher-resolution images, showing the small structures of objects…. You’re also lowering the radiation dose to the patient, which is another main reason for the NIBIB’s enthusiasm.”

In one experiment, Xiaogang Liu’s group at the National University of Singapore started with a commercial flat-panel X-ray detector that used bulk scintillators of thallium-doped cesium iodide. The group removed the CsI(Tl) layer and replaced it with a layer of nanocrystals of cesium lead bromide—an inorganic perovskite—directly coating them onto photodiode arrays. When coupled with photomultiplier tubes, the resulting device had a detection limit that was just 1/400 that of medical X-ray machines, as the group reported in Nature last September. Several X-ray manufacturers are now testing nanocrystal scintillators using his group’s approach, Liu says.

Liu credits grad student Qiushui Chen for coming up with the idea of using perovskite nanocrystals in this way. “A lot of our recent work involves rare-earth materials, which is what conventional scintillators use,” Liu says. To form the perovskite layer, the researchers mixed the nanocrystals with liquid cyclohexane and then spin-coated the mixture onto a flexible substrate.

“We got a little bit lucky, because we discovered that the nanocrystals had to be deposited on the substrate through a solid-state process,” Liu says. “If the particles are dispersed in solution, it’s no good.”

Researchers have also demonstrated perovskites in direct X-ray detectors with vastly superior performance to that of commercial imagers. In general, says the NIBIB’s Zubal, direct X-ray detectors are “highly more desirable” than scintillators because they avoid the extra step of converting visible light into electrons. The projects that NIBIB is supporting involve direct detection.

Jinsong Huang and his group at the University of North Carolina at Chapel Hill have been studying direct X-ray detectors based on perovskites since 2014. (Huang also works on perovskite photovoltaics.) In one experiment, they coated methylammonium lead tribromide—a common perovskite compound—onto a regular X-ray detector that used amorphous silicon to convert the X-rays to electrons. The addition of the perovskite layer made it 3,000 times as sensitive.

“When you want extremely efficient and sensitive detectors, you need to count single photons, and that’s not easy,” Huang explains. “We showed that we can make materials that allow you to distinguish the signal from the noise.” Huang recently created a startup to commercialize radiation detectors based on his group’s work.

There are still a number of hurdles to cross before perovskite scintillators or direct X-ray imagers will be ready for market. A big obstacle is that some perovskites are sensitive to moisture. Liu has developed a method for coating each nanocrystal with silicon dioxide and is exploring other protective methods. Perovskite layers can also be encapsulated in glass, much like traditional solar cells are.

But in general, perovskite X-ray imagers won’t need to be quite as hardy as perovskite PVs or LEDs, because the environmental conditions they’ll face are more benign. Solar panels need to perform even after being exposed to the elements for 20 years, while LEDs are exposed to heat and, of course, light, both of which can degrade a perovskite compound. X-ray machines, by contrast, are typically used in climate-controlled settings. For that reason, Liu and Huang believe perovskite X-ray detectors will be commercialized much more quickly than other perovskite applications.

Huang predicts that perovskite detectors will open up new applications for X-rays, expanding what’s already a multibillion-dollar industry. More efficient imagers would draw less power, lending themselves to portable machines that run on batteries. Liu’s group has also demonstrated a variety of tunable, color-emitting perovskite nanocrystals. That work could lead to multicolor X-ray displays, which are impossible with today’s scintillator X-ray machines.

And because they use flexible substrates, perovskite imagers could conform to whatever’s being scanned; anyone who has experienced the discomfort of a mammogram will appreciate that feature. Faster, more sensitive imagers would also reduce the radiation from dental and medical X-rays and airport security scanners.

“Once we can make X-rays much safer, the market will change because you’ll be able to put the detectors everywhere,” Huang says.

How the United States Plans to Update Its Nuclear Arsenal

Monday, May 20th, 2019

Vice Admiral Dave Kriete explains how the United States maintains its stockpile, and why the nation is developing new low-yield nuclear weapons

Earlier this year, at a sprawling complex in the Texas Panhandle, a new type of nuclear weapon began rolling off the production line and into the United States arsenal. The ballistic missile warheads are low-yield and relatively small, and they reflect a growing push by U.S. President Donald Trump’s administration to modernize the nation’s nuclear weapons program after decades of stagnation.

Vice Admiral Dave Kriete has played a key role in developing U.S. nuclear weapons policies. Since June 2018, he has served as deputy commander of U.S. Strategic Command, the military unit responsible for detecting and deterring nuclear, space, and cyber attacks against the United States and its allies. Kriete also helped craft the 2010 and 2018 Nuclear Posture Reviews—the Pentagon’s guiding document for U.S. nuclear policy, strategy, and capabilities.

Kriete, who is based in Omaha, Nebraska, spoke with Spectrum during a recent visit to New York City. He discussed plans for nuclear weapons modernization, and the challenges to achieving them. This conversation has been edited and condensed for clarity.