
Satyen Sangani: We Are in the Dark Age of Data. It’s Time to Evolve


Data is only as good as the insights we can gain from it. We need to boost data literacy.

For those of us who champion the power of data, the past five years have been an incredible ride thanks to the rise of big data. Consider just these three examples: by 2020, we will have created as many digital bits as there are stars in the universe; data drove President Obama’s wins in the 2008 and 2012 elections; and data is powering the incredible rise of new companies like Uber and AirBnB, allowing people to monetize their most illiquid fixed assets like cars and houses.

Of course, data hasn’t accomplished any of this. Data isn’t the protagonist in any of the stories above. Humans are. People use data. Data can show correlations and trends, but people have insights that suggest cause and effect. Insights are what enable better decisions and drive innovation.

And here’s the catch: in spite of our recent data-driven achievements, the evidence suggests that humans may well be in the dark ages of data. McKinsey, in their broadly read Big Data report, estimates that there will be only 2.5 million data-literate professionals in the United States in 2018 — less than 1 percent of the projected population. Surveys show that professionals today still take action the old-fashioned way — based on gut instinct, personal experience and what they think they know.

So, with all this data, technology and promise, how do we build a more data-literate world?

Consumption Requires Context

If we think of data as food for our mind, the nutrition movement might offer some clues. Today the state of labelling data for appropriate use is akin to the opaque labelling of food products over 40 years ago. Until relatively recently, we had no idea whether the food we ate contained inorganic products, genetically modified ingredients, lead or even arsenic. Today we have raised nutritional awareness by listing critical ingredients and encouraging nutritional literacy that can assist in making healthy eating a conscious behaviour.

Consuming data appropriately requires the same type of conscious evaluation of ingredients. Here is a relatively common and simple example:

At one large multinational company, it turned out that the Date of Birth field was generally left unpopulated and instead defaulted to Jan. 1, 1980. If you didn’t know this and tried to calculate the average age of your customers, you would conclude that they are younger than they really are. The mistake happens so often that it has created a myth within the institution that it serves young customers, when its actual customers are typically middle-aged.
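To make the pitfall concrete, here is a minimal sketch in Python, using a handful of made-up records rather than the company’s actual data, of how a sentinel default drags the computed average age down and how excluding it changes the answer:

```python
# A minimal sketch (made-up records, not the company's data) of how a sentinel
# default like Jan. 1, 1980 skews the apparent average customer age.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4, 5, 6],
    # Two genuine birth dates and four records that fell back to the default.
    "date_of_birth": pd.to_datetime([
        "1962-07-14", "1958-03-02",
        "1980-01-01", "1980-01-01", "1980-01-01", "1980-01-01",
    ]),
})

AS_OF = pd.Timestamp("2015-10-01")
SENTINEL = pd.Timestamp("1980-01-01")

ages = (AS_OF - customers["date_of_birth"]).dt.days / 365.25

naive_average = ages.mean()                                          # dragged toward the default
clean_average = ages[customers["date_of_birth"] != SENTINEL].mean()  # only real birth dates

print(f"naive average age:     {naive_average:.1f}")   # ~42 -- looks like a young customer base
print(f"excluding the default: {clean_average:.1f}")   # ~55 -- the customers are middle-aged
```

The fix is not to throw the records away but to know the ingredient: treat Jan. 1, 1980 as “unknown,” not as a birthday.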

Drawing incorrect conclusions from data often does more damage than not using data at all. Consider the spurious link between vaccinations and autism, or the finding that only six of 53 landmark cancer studies could be reproduced. An Economist survey found that 52 percent of executives discounted data they didn’t understand, and rightfully so. The Economist reminds us that a key premise of science is “Trust, but Verify.” The corollary also holds true: if we can’t verify, we won’t trust.

Packaging Data

No one wants to consume something that they’re not expecting. If someone expects a red velvet cupcake and you feed them pizza, they might live with it, but the initial experience is going to be jarring. It takes time to adjust. So, what does this have to do with data?

Data doesn’t really speak your language. It speaks the language of the software program that produced the information. You say sales, and the dataset says rev_avg_eur. You say France, and the dataset says CTY_CD: 4. Can these labels be learned? Sure, but even in a relatively small organization, there might be 20 software programs in use every day, each of which has hundreds of different codes, attributes and tables. Good luck if you are in a multinational organization with tens of thousands of such programs.
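A lightweight way to supply that missing context is a data dictionary that travels with the data. The sketch below reuses the hypothetical names from the example above; the mappings themselves are invented for illustration:

```python
# A minimal sketch of a data dictionary. The column names and codes come from the
# example above; the mappings themselves are invented for illustration.
COLUMN_GLOSSARY = {
    "rev_avg_eur": "Average sales per customer, in euros",
    "CTY_CD": "Country code",
}

COUNTRY_CODES = {
    4: "France",   # hypothetical mapping; real systems keep this in a lookup table
    5: "Germany",
}

def describe(column, value=None):
    """Translate a raw column name (and, optionally, a coded value) into business terms."""
    label = COLUMN_GLOSSARY.get(column, column)
    if column == "CTY_CD" and value is not None:
        return f"{label}: {COUNTRY_CODES.get(value, f'unknown code {value}')}"
    return label

print(describe("rev_avg_eur"))   # Average sales per customer, in euros
print(describe("CTY_CD", 4))     # Country code: France
```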

This translation has a larger unseen cost. A recent industry study highlighted that 39 percent of organizations preparing data for analysis spend time “waiting for analysts to assemble information for use”. And another 33 percent spend time “interpreting the information for use by others.” If, every time we need an answer, it takes us hours or days to assemble and interpret the information, we’ll just ask fewer questions — there are only so many hours in a day. Making data easy to consume means ensuring that others can easily discover and comprehend it.

A Data-Literate World

We have an incredible opportunity in front of us. What if just 5 percent of the world’s population were data literate? What if that number reached 30 percent? How many assumptions could we challenge? And what innovations could we develop?

According to Wikipedia, the skills required to be data literate include understanding what data means, drawing correct conclusions from data and recognizing when data is used in misleading or inappropriate ways. These are the decoding skills that enable an individual to apply data analysis accurately to decision-making. Rather than focusing on making data consumers do more work, maybe we can boost literacy by surrounding the data with context and reducing the burden of understanding the information.

Metrics and statistics are wonderful, but we need to surround data with more context and lower the costs of using data. More fundamentally, we have to reward those people and systems that provide this transparency and usability. Data is just information — we need to evolve in how we use it to unlock its potential.

This piece first appeared in the World Economic Forum’s Agenda blog.

 

Satyen Sangani is the CEO of Alation, a World Economic Forum Technology Pioneer.



Go with Current: GE’s New Energy Business Changes the Power Game


GE announced today the creation of Current, a startup that combines energy hardware with a digital backbone to make power simpler and more efficient for customers.

The company, which is backed by GE’s balance sheet, brings together GE’s LED, Solar, Energy Storage and Electric Vehicle businesses as a one-stop shop for early customers like Walgreens, JPMorgan Chase, Hilton Worldwide and others.


Based in the Boston area, Current reflects GE’s shift toward new business models that address customers’ end goals. Instead of selling separate energy products, Current will combine power offerings as a service and reduce cost for companies by simplifying energy and providing improved efficiency. It will use GE’s Predix software platform to collect data and help customers understand how they’re using, and losing, energy.

The goal is to cut waste for customers who, for example, power millions of square feet of factory space or retail operations. The services are projected to save customers 10-20 percent on their energy bills and also help utilities manage demand on the grid.

“Commercial enterprises can’t afford complexity and inefficiency in energy solutions if they are to remain competitive,” says Beth Comstock, a vice chair of GE who oversees Business Innovation. “They are looking for ‘future proofed’ solutions. From the socket to the grid, we understand how the electrons flow and have the unique position to optimize energy regardless of the scenario or customer.”


Current builds on the history of GE founders like Elihu Thomson, who merged his Thomson-Houston Electric Co. with Thomas Edison’s Edison General Electric Co. to create GE. Thomson’s inventions laid the groundwork for the modern electric grid.

The new company, which will be headed by Maryrose Sylvester, the president and CEO of GE Lighting, begins life with $1 billion in revenue. It will create approximately 200 new jobs.


High-Altitude Science Reveals Secrets of Glowing Plasma


There’s a lot of science happening in the bowels of the International Space Station 249 miles overhead. Astronauts are chowing down on experimental salads grown with LED lighting and hydroponics. Silkworms are being bombarded with cosmic radiation to see how they react.

In June, in a laboratory called Plasma Krystall-4 (PK-4), scientists also started unlocking the secrets of plasma: ionized gases that make up the fourth state of matter and 99 percent of all visible material in the universe. Closer to home, a common form of plasma glows inside neon signs.


A plasma ball

PK-4 is a collaboration between the European Space Agency and the Russian Federal Space Agency. The researchers are relying on a rugged GE computer design to study a special subgroup of plasma called complex plasmas. This is the second ISS research project using GE technology. Since 2011, astronauts have been using an ultrasound system made by the company’s healthcare unit for cardiac, muscle, vessel and blood flow analysis.

Complex plasmas are a low-temperature mixture of microparticles and ionized and neutral gases found throughout space. Researchers hope that learning more about this material will contribute to the fundamental understanding of nature and better designs for returning spacecraft, whose heat shields generate plasma as they reenter Earth’s atmosphere.


Top: The International Space Station is orbiting 249 miles above Earth. Above: PK-4 control and video unit during an in-orbit installation. Image credit: ESA/ROSCOSMOS

Such complex plasma experiments can’t be performed on Earth in the same way because the planet’s gravity distorts the results by acting on the suspended microparticles. But observing the material in microgravity will help scientists better understand the fundamentals of how plasma naturally flows in space. Researchers hope PK-4 will reveal some of the mysteries of so-called plasma crystals, which form when microparticles like dust become highly charged by exposure to ionized gas. Charged particles then start interacting and self-organize into crystal structures.
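How strongly those charged microparticles interact is commonly summarized in the dusty-plasma literature by a Coulomb coupling parameter; this is standard textbook background rather than anything taken from the PK-4 team’s materials:

$$\Gamma = \frac{Q^{2}}{4\pi\varepsilon_{0}\, d\, k_{B} T},$$

where $Q$ is the charge accumulated by a microparticle, $d$ is the typical spacing between particles and $k_{B}T$ is their thermal energy. When electrostatic repulsion dwarfs thermal motion, roughly when $\Gamma$ climbs well above 100, the particles settle into the ordered, crystal-like structures PK-4 is built to observe.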

“Complex plasmas are studied in gas discharges at low pressures,” writes Dmitry Zhukhovitskii, a physicist at the Russian Academy of Sciences who is working on the PK-4 experiments. “Under microgravity conditions, large volumes of 3D complex plasma can be observed. These conditions are realized either in parabolic flights or onboard the International Space Station.”


Above: PK-4 started running research in June 2015. Image credit: NASA/ESA

At the heart of PK-4 are two GE CR11 units – single board computers ruggedized by the company’s Intelligent Platforms business. They can perform in harsh environments where previously it would be unthinkable to put delicate electronics—on oil and gas platforms, inside military equipment and in space.

The computers running the PK-4 experiments are built to record massive amounts of video at 130 megabytes per second and automatically execute a list of commands to perform the science and display video from the experiments.

“This is a great example of what GE Rugged is all about,” said Chris Lever, general manager for embedded systems at GE’s Intelligent Platforms business. “Whether it’s in the harsh environment of a heavy manufacturing facility, a railroad locomotive, onboard an armored vehicle – or, as it is here, out in space – GE’s solutions are designed to operate with absolute reliability wherever they are deployed, in whatever conditions.”


The International Space Station is orbiting 249 miles above Earth.

Rubin Dhillon, director of marketing for the embedded computing division of GE’s Intelligent Platforms business, says there are a number of fundamental changes that need to be made to off-the-shelf electronics to make sure they keep operating when subjected to heavy vibration, shock, radiation, temperature swings and humidity.

“A very simple example is soldering components to the printed circuit board, rather than the more common approach of inserting those components into sockets,” Dhillon says. “A component that’s soldered on will stay where it’s put, however much shock and vibration it’s subjected to. It’s a more expensive way of doing things – but if a computer is mission-critical, no company wants to skimp.”

He says that highly specialized missions like those aboard the ISS show rugged GE computers can perform “sophisticated, demanding applications that require huge amounts of data to be processed very fast, coupled with the ability to operate with absolute reliability under the most challenging of conditions.”

Says Dhillon: “If we can build computers good enough for that, you can bet we can build computers good enough for pretty much anything anyone wants to throw at them.”

The Heat Is On: How New Horizons Got Its Power


Feature by feature, they revealed themselves: the plains of Sputnik, the Norgay Montes and the vast and forbidding Cthulhu Regio.

When the New Horizons spacecraft finally buzzed Pluto at roughly 30,000 mph last summer, it sent back snaps of an untamed land of craterless plains and jagged ice mountains beyond our imagining. And those pictures of the dwarf planet traveled the expanse of space thanks to a 125-pound power plant that doesn’t know the meaning of quit.

It’s called the RTG, or Radioisotope Thermoelectric Generator.

Originally designed by GE’s Space Division (now part of Lockheed Martin) in King of Prussia, Penn., this model of RTG has powered U.S. spacecraft since the Ulysses probe was launched in 1990. The electricity from the RTG doesn’t propel the spacecraft, which use inertia from the launch and gravity slingshots around planets, but it’s necessary for the missions to snap pics, gather data and phone home.

Above: An artist’s rendition of the New Horizons spacecraft approaching Pluto’s moon Charon. Pluto is in the background. Image credit: Getty Images. Top: Pluto photographed by New Horizons on July 13, 2015. Image credit: NASA

The RTG takes advantage of the predictable decay of Plutonium 238, a radioisotope produced by certain nuclear power plants. The heat given off by the plutonium, which is in the form of 18 fire-resistant ceramic pellets, is transformed into electricity by a process known as the Seebeck effect.

Discovered by the 19th century German physicist Thomas Johann Seebeck, the effect describes how heat can be turned into electricity when two different conductive materials in a closed circuit called a thermocouple are kept at different temperatures. In a spacecraft, this means that the “hot” junction absorbs the heat from the plutonium while the “cold” junction is kept frigid by the near-absolute zero of space. The temperature difference between the two junctions results in an electrical current.
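For a single idealized thermocouple the relationship is simple; a real generator stacks hundreds of such junctions to build up a useful voltage:

$$V = \left(S_{A} - S_{B}\right)\left(T_{\text{hot}} - T_{\text{cold}}\right),$$

where $S_{A}$ and $S_{B}$ are the Seebeck coefficients of the two conductor materials. The larger the gap between the plutonium-heated junction and the space-chilled junction, the more voltage each couple delivers.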

The internal heat source is necessary in space, where the sun’s energy is too weak to power solar panels. The RTG provided about 250 watts of power when New Horizons launched in 2006. But the device loses 5 percent of its power every four years and outputs about 200 watts currently.


GE was a critical part of the space race from its earliest days, working on computers, lunar landing modules, moonboots and even astronaut escape pods. But the RTGs were the most lasting contribution.

The older model RTGs GE engineered for the Voyager 1 and Voyager 2 spacecraft are still providing power for the craft, which have made it further from home than any manmade object in history.

The production of the thermocouples was stopped after the Voyager program in the 1970s, and GE had to kickstart it for the Galileo mission. The new model, called the GPHS-RTG, for general-purpose heat source, packed the plutonium fuel differently than the Voyager generators did.

GE got out of the space business altogether in 1992. The company would still probably honor any service obligations, but the product is now a bit out of reach.

Ladies and Gentlemen, Upload Your Engines: GE’s First Chief Digital Officer Ganesh Bell Believes that Hardware is the Future of Software


Silicon Valley veteran Ganesh Bell believes in the power of software so much that he decided to join one of the world’s largest industrial companies. “After two decades in the software business and working in the Valley, I’ve heard Marc Andreessen say that software was going to eat the world and it clicked,” he says. “I realized the next software company wouldn’t be a software company at all. Everyone has access to cloud, big data, and software talent. It’s the companies with deep industry domain in machines, infrastructure and operations expertise that will have the upper hand. The future of software is building digital businesses.”

It’s a bold notion, but in an era when everything from fitness monitors to subsea blowout preventers is getting connected to the Internet, it’s also a prescient one. Bell now works as chief digital officer at GE Power & Water, the first such C-suite executive in GE’s 123-year history. Although he has the affable mien of a hoodie-clad West Coast software entrepreneur, beneath that easygoing veneer flickers the flinty intensity of a motorbike racer who likes to push himself hard at the track. “I’ve always believed that in order to succeed, you have to commit to being uncomfortable,” he says.


Top image: Ganesh Bell on his Aprilia bike at the race track. Image courtesy of Ganesh Bell. “It’s the companies with deep industry domain in machines, infrastructure and operations expertise that will have the upper hand. The future of software is building digital businesses,” says GE’s first Chief Digital Officer Ganesh Bell. Image credit: GE Reports

Bell’s got the record to prove it. Two years ago, after he left a pure play Silicon Valley software job, he spent six months travelling the world and telling everyone about his epiphany. Then one day there was an email from GE in his inbox. “It said, ‘We’ve heard you, we’re intrigued, we should talk,’” he says. “At that point, I had to decide whether I really believed what I was saying or whether it was just hubris. The stakes got pretty high when the world’s largest industrial company came knocking. So I said yes.”

Bell’s chief domain is the company’s multibillion-dollar energy business making turbines, generators, transformers and other power machinery. “The entire value chain of energy must go digital or you’re going out of business,” he says. “There is no other way.”

GE Power & Water, not to mention other GE units, had never had a digital officer before. That’s partly because the business of making electricity and sending it to customers remained fundamentally unchanged from its earliest days. But in the last decade, when utilities started adding wind, solar and other renewable sources of energy, the rigid grid system suddenly had to become more flexible. It had to accommodate power from the sun, which doesn’t always shine, and the wind, which doesn’t always blow. Meanwhile at home, people also started putting solar panels on their homes, using their own generators and even storing and trading power from batteries. “It’s like going from juggling one ball to juggling 10,” Bell says. “We need to use software to keep all the balls in the air.”

Bell has his office at GE’s software headquarters in San Ramon, Calif. Late last month, he was at GE’s Minds + Machines conference in nearby San Francisco, where his business just launched the world’s first digital power plant, which has the potential to save customers $300 million over its lifetime. Earlier this year, GE also came out with the first digital wind farm, which could lead to $100 million in lifetime savings, and started connecting the dots on transforming all of power generation, grid and consumption with software.

Bell says that customers have always been asking for outcomes like these, but they are now demanding them in ways that are shaking up both software and hardware. “In the past, they wanted better machines, but now they’re demanding outcomes at the enterprise level,” he says. “They want to optimize their entire companies, from business and operations to asset performance.”


“In the past, [customers] wanted better machines, but now they’re demanding outcomes at the enterprise level,” Bell says. “They want to optimize their entire companies, from business and operations to asset performance.” GIF credit: GE Reports

Optimizing business processes has traditionally been the domain of enterprise software behemoths like SAP, one of Bell’s former employers. It was there, during the advent of cloud computing and software as a service, that Bell started seeing glimpses of the future. “I discovered that any company with deep domain expertise can start digitizing their knowledge and offer it as a service,” he says. “I felt that at SAP, we could be threatened by our own customers.”

Bell says that GE has an opportunity to quickly grab a big share of the quickly growing digital industrial service market, which is estimated to top $220 billion by 2020. “There’s no other software market of this size anywhere else,” Bell says. “For example, right now, we are using software to optimize a wind turbine. But the solution is getting so powerful that we can optimize an entire wind farm.”

Bell might be the first chief digital officer in GE history, but not for long. The model has gained traction and it’s quickly becoming part of GE’s playbook. Other GE businesses – from healthcare to aviation – are already developing their own digital service offerings. In September, the company also launched a new business called GE Digital that will connect them all.

Bell recently talked about his vision and the importance of seeking insights from different fields and businesses with GE executives at the company’s “corporate university” in Crotonville, N.Y. When he started out as a software developer in India, programmers working on different projects and in different computer languages still had to share time on a single mainframe. “I was the only person who programmed in three languages and thus the only one who could be productive the full eight hours,” he said. “It was then when I realized that specialization was for ants. It was true then and especially today. In the valley, you have to be a polymath and a Renaissance man or woman to really understand what’s going on.”

Being a Renaissance man is something Bell embraces with gusto. When he is not disrupting industries, you can find him riding his Ducati or his Aprilia RSV4 Superbike motorcycles at the track. “You have to push yourself into the uneasiness,” he says. “When I started at GE, I knew a lot about software but I didn’t know squat about power generation. But you learn much more when you’re really uncomfortable, that’s the true growth mindset. Now I can tell you how an air-cooled gas turbine makes electricity. Try me!”

David Schwartz: Big Data Demands Big Security


Devising strategies to protect Big Data is as important as analyzing the information.

 

Big data is becoming an increasingly important part of the business plan for companies in many different industries. Analyzing large customer datasets and other kinds of data with tools like Hadoop lets companies save money as well as boost revenue: they can target their marketing better, design products that appeal more to their customers, make better predictions, and so on.

On the other hand, this rise in the use of big data has coincided with the rise of advanced persistent threats to data security. Big data is not just lucrative to the companies that collect it: it is also worth money to identity thieves and other bad actors. This has given rise to a cottage industry in hacking and cracking. Companies that use big data, especially if that data consists of personal information of customers, are at an elevated risk of drawing hacking attempts. Developing ways to protect that data will prove to be just as important as the data itself.

The last few years have seen hacking capture headlines on a regular basis. Large companies like Target have become targets, with hackers stealing credit card information of millions of customers at a time. Even the U.S. government has been affected. The federal Office of Personnel Management was breached earlier this year, and detailed personal information of several million American citizens was stolen by unknown hackers. These breaches are only the latest in a string of such attacks.

Furthermore, just because the largest companies are the most likely to make the news does not mean that smaller companies are safe. Hackers know that while large companies tend to control more data, small companies have less robust cyber-defenses, leaving them more vulnerable to organized attacks, regardless of whether they have VSAT internet or traditional business-class cable.

There are two main ways companies can make themselves less susceptible to breaches. The first is the soft approach: update security protocols. Many recent breaches have happened not because the attackers overcame the target’s defenses through sheer power, but because they exploited weak protocols. If workers neglect to change their passwords regularly, for example, an attacker can gain access to the network more easily. Enforcing protocols that govern password use, requiring proper identification before handing out usernames and passwords, and restricting which kinds of user accounts have permission to access internal networks and resources can help seal off the company from attacks.
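As a toy illustration of that last point, here is a sketch in Python of a permission check driven by account roles; the role names and resources are invented, and a real deployment would lean on directory services or IAM tooling rather than a hard-coded table:

```python
# A minimal sketch of role-based access checks; the roles and resources are hypothetical.
ROLE_PERMISSIONS = {
    "analyst":  {"reporting_db:read"},
    "engineer": {"reporting_db:read", "app_servers:deploy"},
    "intern":   set(),  # no access to internal data stores by default
}

def is_allowed(role, action):
    """Grant an action only if the role has been explicitly given that permission."""
    return action in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("analyst", "reporting_db:read")
assert not is_allowed("intern", "reporting_db:read")      # least privilege by default
assert not is_allowed("analyst", "app_servers:deploy")    # analysts can't touch servers
assert not is_allowed("contractor", "reporting_db:read")  # unknown roles get nothing
```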

The other approach is simply to strengthen the company’s defenses. Usually, this means outsourcing information security to a specialist firm. This aspect of information security is harder for the company to control, because it is difficult to know how much computing power an attacker might have or what techniques they would use to attempt a breach. In general, however, hackers avoid confronting such defenses head-on whenever possible: it is cheaper and easier to exploit weak protocols or flaws in vendor software than to overcome defenses with direct attacks.

This problem is only going to become more important as the use of customer data becomes more popular. It is certainly true that big data is far too lucrative for the threat of a breach to dissuade companies from collecting data. However, it does mean that they need to be more careful of the possibility of hacking.

A good portion of preparing for attacks is planning how to recover from one. Every company that uses data should map out what it would do after a breach, just as it has a plan for a fire or a storm. Attacks can and do happen, and the fallout is far worse if the company is not ready to respond. A nasty breach can sink an unprepared company because it loses customers’ trust.

Having a plan and a way to reassure customers that they are protected is crucial for responding to a hack after it takes place. That will become a necessary part of the business plan of any company that wants to use big data in the long run.

(Top image: Courtesy of Thinkstock)

This piece first appeared on SmartDataCollective.

 

David Schwartz is a technology angel investor and startup consultant based out of Aspen, Colorado. In addition to taking several startups to IPO stage, he spends most of his time mentoring young tech entrepreneurs.

 

GE Will Sell its Commercial Lending and Leasing Units to Wells Fargo in the Largest Deal to Date


GE’s journey to exit the banking sector and transform itself into the world’s largest digital industrial company crossed a major milestone today when the company announced it signed an agreement to sell financial assets valued at $30 billion to Wells Fargo & Co.

This is the largest deal yet since GE said last April that it would sell the majority of GE Capital’s holdings.

Keith Sherin, chairman and CEO of GE Capital, said the transaction was “a critical step” in GE’s efforts to reduce the size of GE Capital. “Since our April 10 announcement, we’ve signed more than $126 billion in transactions, which is over 60 percent of our overall plan, and are on track to become less than 10 percent of GE’s earnings as the company transitions to a more focused digital industrial company,” he said.


GE Capital will sell most of its assets, but it will keep the units that directly support GE’s businesses like aviation and healthcare. Image Credits: GE Reports

Just before the news came out, Nick Heyman, an analyst at William Blair, raised his 12-month price target on GE stock to $32, up 14 percent from this morning’s price. “…We sense that few fully perceive what GE will become and how different the sources of its growth will be,” Heyman wrote in his report. “We believe that GE is approaching the inflection point in its transformation where internal company actions rather than external factors will increasingly govern the speed and ability with which GE completes its transition into the premier base infrastructure company in the world.”

The portfolio sold to Wells Fargo includes GE’s global Commercial Distribution Finance (CDF), North American Vendor Finance and Corporate Finance businesses. The CDF business provides customized inventory financing to fund the flow of finished durable goods from manufacturers. GE Capital’s Vendor business is a leading provider of private label and co-branded programs for OEMs, dealers and end users. The Corporate Finance business is a provider of senior secured lending and leasing solutions.


GE announced in April its plans to focus on high-return industrial businesses and shrink GE Capital through the sale of most of its assets by the end of 2016. The financing units that directly support GE’s businesses like aviation and healthcare will remain with GE.

Sherin said that as part of the April plan, GE Capital will look to de-register as a Systemically Important Financial Institution. “Once the U.S. transactions have closed and the split off of GE Capital’s retail finance business, Synchrony Financial, has occurred, GE Capital expects to file an application in 2016 for de-designation as a Systemically Important Financial Institution as its footprint in the U.S. will be significantly reduced,” Sherin said. “Globally, GE Capital expects to be substantially done with its exit strategy by the end of 2016.”

Hybridized, Ultra-Efficient Fuel Cells Poised to Power the Future


The emerging Age of Gas, the rise of distributed power and technological innovation will accelerate the adoption of fuel cells into the global energy landscape over the next decade.

 

The future of power is now under development and it is packaged in stacks of fuel cells. As the world’s energy needs increase, and calls for sustainability continue to rise, a number of forces are converging to expedite the adoption of fuel cells into the global energy mix.

In many ways, the adoption of fuel cells will mirror the renewable energy transition of the last decade. Just as renewables progressed from niche applications to mainstream power technologies, fuel cells, too, are set to become part of the global power portfolio in the decade ahead.

How does the technology work? Fuel cells convert the chemical energy in natural gas or hydrogen into electricity, heat and water through an electrochemical reaction with oxygen. While similar to batteries in structure, they rely upon an external fuel source instead of stored chemical reactants. Fuel cells are a form of power production that is modular, easily scalable, yields very low emissions, and is highly efficient.
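For the simplest case, a hydrogen-fed cell, the underlying electrochemistry looks like this (written for a proton-conducting cell; the solid oxide cells discussed below shuttle oxide ions instead, and natural gas adds a reforming step, but the overall reaction is the same):

$$\text{Anode: } \mathrm{H_{2} \rightarrow 2H^{+} + 2e^{-}} \qquad \text{Cathode: } \tfrac{1}{2}\mathrm{O_{2}} + \mathrm{2H^{+}} + 2e^{-} \rightarrow \mathrm{H_{2}O}$$

$$\text{Overall: } \mathrm{2H_{2} + O_{2} \rightarrow 2H_{2}O} + \text{electricity} + \text{heat}$$

The electrons pushed around the external circuit are the useful output; for the hydrogen-fed case, water and heat are the only by-products.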

Although there is approximately 570 MW of installed fuel cell capacity across the globe today, high product costs have hindered greater industry growth. As explained in a paper, GE’s fuel cell technology, and specifically the process by which each individual fuel cell is manufactured, is poised to overcome this obstacle and spark future market growth.

The Age of Gas

After decades on the margin of the global power system, natural gas is shifting from a regional and marginal fuel to becoming a focal point of the global energy landscape. It now competes head-to-head with oil and coal, and complements wind and other renewable energy resources. At the global level, natural gas production and consumption is growing, in part because the land-based and seaborne networks that underpin the connection between supply and demand are becoming more diverse as they expand around the world. Gas network growth, coupled with technology innovation, is contributing to greater availability, delivery flexibility and improved economics. Natural gas fuel cells such as the FC-CC are poised to benefit from this emerging age of gas, which promises greater gas availability around the globe and more economically favorable gas prices.

The Rise of Distributed Power

The rise of distributed power is being driven by forces that are propelling the broader decentralization movement: distributed power technologies are more widely available; and they are smaller, more efficient and less costly today than they were just a decade ago. Distributed power systems have lower capital requirements and can be built, and become operational, faster and with less risk than large power plants or new transmission lines. GE expects annual distributed power capacity additions to grow from roughly 150 gigawatts (GW) per year today to 200 GW per year by 2020.

Stationary distributed power fuel cells will be a primary beneficiary as global power networks incorporate an increasing number of distributed generation technologies and migrate toward integrated power networks that contain a combination of both central and distributed power systems connected through sophisticated physical and digital networks.

Technology Innovation

The synergies created by broad and deep connections across global industries have helped spur fuel cell innovation. For example, thermal spraying techniques developed for GE’s Aviation and industrial gas turbine businesses are being reapplied to manufacture solid oxide fuel cells (SOFCs), opening the door to high volume, low cost manufacturing. These SOFCs can then be put together with technologies such as Ecomagination-certified GE Jenbacher gas engines to create a hybrid fuel cell system that we call the Fuel Cell-Combined Cycle (FC-CC). The engine replaces the conventional SOFC thermal oxidizer used in other designs, generating additional power. The result is projected electrical efficiency in the range of 60-65 percent — something that previously could only be achieved in large centralized power plants. GE is so excited about the FC-CC technology that it has created a nimble start-up, GE-Fuel Cells, to commercialize it.

In a world that is increasingly characterized by environmental constraints and a rising tide of natural disasters, FC-CC’s benefits — such as reduced environmental emissions, net water production, high efficiency, dispatchability and reliability — make it an ideal technology for today’s changing energy landscape and the power needs of tomorrow.  Taken together, these trends tell us that the outlook for fuel cells is positive.

(Top GIF: Video courtesy of GE)

 

Brandon Owens is the Strategy and Analytics Director at GE Ecomagination.


John McGuinness is the Strategic Marketing Leader for GE-Fuel Cells.



Power in the Sewer: One Person’s Wastewater is Another’s Electricity


It takes water to create energy and energy to treat water. But wastewater treatment plants are also often among the biggest users of electric power, sending electricity down the drain.

Engineers at GE, however, have hit on solutions that could solve both problems in one fell swoop. “What if you can have it all?” says Tom Stanley, chief technology officer at GE Water & Process Technologies. “We can help turn wastewater treatment plants from energy consumers into energy producers.”

Stanley says that part of the equation is finding new ways to reduce their energy consumption. One area ravenous for power inside wastewater treatment plants is aeration, a process that brings oxygen into the sludge and helps bacteria break down the organic matter.


Top GIF and above: Rather than simply pumping oxygen into the water, the reactor sends it through bundles of hollow fibers to grow a biofilm. GIF credits: GE Power & Water

Today, the average water treatment facility spends as much as 60 percent of its energy on aeration, a clear target for savings.

Stanley’s solution revolves around a technology called the membrane aerated biofilm reactor (MABR). Rather than simply pumping oxygen into the water, the reactor sends it through bundles of hollow fibers to grow a biofilm. The fibers transfer oxygen to the biofilm more efficiently and help it neutralize the wastewater pollutants.

The design is so effective that a plant can reduce energy needed for wastewater treatment aeration four-fold. “A great thing about using a MABR is that you don’t have to shut down a facility to achieve this upgrade,” Stanley says. “You can retrofit within the plant’s existing footprint by inserting a MABR into the existing biology tank, enabling higher throughput and improved nutrient removal.”

Once you have addressed energy consumption, you can also start using the solids caught inside the plant to produce energy. The GE design feeds them into an anaerobic digester where another set of bacteria turns them into biogas.


Workers are installing a MABR reactor. Image credit: GE Power & Water

The water treatment plant then uses the biogas to fuel a power-generating gas engine or sends it to other users through a pipeline. “Both options are efficient, effective and cost-conscious ways to run a wastewater treatment plant,” Stanley says.

All countries can benefit from the solution, regardless of their GDP. Stanley says that in the U.S. alone, for example, wastewater treatment plants use enough energy to power 2 million homes. The demand will likely grow by 20 percent, following population increase and more stringent regulations. “At the end of the day, the EPA is not coming to a plant asking them to reduce energy,” he says. “But municipalities need to meet effluent quality specs, as well as a budget. With the advances in technology, they can now expand capacity, meet regulatory demands and reduce the energy bill by employing technology investments with a good return.”

The technology could be even a bigger deal in developing countries, where electricity as well as clean water are often scarce. Says Stanley: “For cities in emerging markets that currently have no wastewater plant in place, they have an opportunity to do it right from step one and build a self-sufficient, independent, cost-effective wastewater treatment plant that doesn’t require access to the grid for electricity.”

 


How Big Data and the Industrial Internet Can Help Southwest Save $100 Million on Fuel

Southwest Airlines to be launch customer for new Boeing 737 Max aircraft.

Map out one million flight plans each year for Southwest Airlines. Everything from planeloads of chilly Chicagoans heading for vacations in Cancun to budget-minded businesspeople dashing from Los Angeles to New York. It’s difficult. Now try toting up the countless variables on each one of those flights, such as the air’s humidity and the fuel load on each leg, in hopes of accurately calculating their impact on the bottom line.

It’s a task that was once impossible. But now, thanks to the Industrial Internet – a digital network that connects machines like jet engines to software and the data cloud – and a slew of new GE technologies, that’s changing.


Seeking new insight into what’s happening during every flight, Southwest just became the first U.S. domestic airline to use a big data system developed by GE’s Flight Efficiency Services (FES) unit. The system runs on the secure Industrial Internet, using cloud computing and cutting-edge software and analytics. Southwest, which manages a fleet of nearly 700 Boeing 737s, can use its flight analytics to drill down to each individual plane and flight to discover how decisions on each flight may have altered its profitability. Australia’s Qantas also just announced it would start using the system, joining existing international customers EVA Air, AirAsia, Swiss International Airlines, Zhejiang Loong Air and SpiceJet.

“For the first time ever, the airline can look at what they planned to do and what actually happened,” says John Gough, executive engagement leader at GE Aviation Digital Solutions. “This is something that airlines have historically not been able to do because of the vast amount of data involved.” GE’s technology will gather all the data generated by Southwest’s flights, Gough says, and combine it with the airline’s operational and planning data, including details about fuel, passenger and cargo loads, information about the weather and navigational data.


In the U.S., Southwest has pursued this concept for years. GE Aviation’s Digital Solutions business already provides the company with Flight Operational Quality Assurance analytics, a system that captures and analyzes the data generated by an aircraft while it flies from one point to another.

The new technology reaches much further. The tool, which is powered by Predix, GE’s cloud-based industrial software platform, starts with collecting data generated by each Southwest aircraft: wind speeds, ambient temperatures, weight of the plane, maximum thrust and so on. GE applies proprietary techniques and historic intelligence to analyze the data. The Southwest team can then pore over the resulting aircraft performance analytics to find patterns that previously might have been hard to detect. The ultimate goal: transform a torrent of raw data about individual flights into actionable insights that optimize airline operations.

Flight analytics could help Southwest decide whether it should add or subtract flights to some of its routes. But it also aims to identify smaller, subtler improvements. If data shows that planes on a particular route consistently carry too much fuel, reducing fuel loads will not only cut costs, it could allow Southwest to consider selling additional tickets to passengers or taking on more cargo. Reduced costs and increased revenue — those are the lifeblood of any airline.

“This is not just about saving costs but also potentially growing revenues,” Gough says. “Now an airline can understand the cost drivers on each phase of a flight — taxi, takeoff, climb, cruise, descent, approach, landing and taxi back to the gate. Previously they did not have insight into all that.”

For years, Gough says, airline planning was an inexact science because there were too many variables, from the altitude actually flown because of unexpected wind conditions to outside air pressure, speed and weather. Even the weight of otherwise identical aircraft can vary by as much as 2,000 pounds because of each plane’s maintenance history, Gough says. Now real-time data from hundreds of aircraft, coupled with data on everything from weather to navigation, allows airlines to plan more precisely. Gough likened it to medical imaging advances.

“Now airlines can be very surgical in nature,” Gough says. “It’s like the difference between an X-ray and an MRI. We are providing an MRI to an airline’s operations, whereas before they at best had an X-ray.”

The system will also help Southwest identify ways to save fuel. In 2014 airlines wasted $4.3 billion of fuel while planes idled on the tarmac. GE estimates that a 1 percent reduction in jet fuel use could save the global commercial aviation industry $30 billion over 15 years.

For example, an Austin-to-New York flight might plan to cruise at 33,000 feet. But because of turbulence, pilots might decide to move down to 29,000 feet. Now the system will specify how much extra fuel will be burned and what that will cost, giving the pilots more information to make a tactical decision.

With airline profit margins often at just 2 percent of operating expenses, being fuel-efficient is critical. Unlike cars, ships and other, less lofty means of transportation, planes can’t tap alternative energy sources like natural gas and electricity. Fuel amounts to more than 20 percent of the average airline’s operating expenses and is the fastest-rising cost facing airlines, up an incredible 130 percent over the past 15 years. Using GE’s FES, a pilot can make optimized decisions about flight plans and fuel load. The system can also help fine-tune internal policies, such as measuring whether pilots are following instructions to reduce gas-guzzling takeoff thrust at 1,500 feet.

Gough says using FES to manage fuel can reduce annual fuel costs by as much as 2 percent. For example, in 2014, Southwest spent $5.27 billion to buy fuel, meaning the FES system could identify potential fuel-cost savings of up to $105 million each year.
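The arithmetic behind that figure is simple enough to check; this snippet just restates the numbers quoted above:

```python
# Back-of-the-envelope check of the fuel-savings estimate quoted above.
fuel_spend_2014 = 5.27e9   # Southwest's 2014 fuel bill, per the article (USD)
savings_rate = 0.02        # FES-guided decisions can trim up to 2 percent

annual_savings = fuel_spend_2014 * savings_rate
print(f"Potential annual savings: ${annual_savings / 1e6:.0f} million")  # -> $105 million
```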

Think about all that the next time you’re stuck on the tarmac waiting for a gate.

Listen to This! These “Intelligent” Street Lamps Can Hear Gunshots, Call for Help


The cities of San Diego and Jacksonville are testing an “intelligent lighting” system using sensor-enabled lamps connected to the Industrial Internet with the ability to monitor traffic, get severe weather warnings and even spot empty parking places. Crime could be next on the list.

GE Lighting, which designed the lighting system, and SST, Inc., a developer of gunfire detection and location technology, just signed a memorandum of understanding to embed SST’s ShotSpotter sensors and software as an option in GE’s suite of intelligent LED technology for cities. The acoustic sensors and software would give street lamps the ability to detect gunfire in real time, connect to 911, alert police patrol cars, and ping smartphones with the precise location of the shooting incident, the number of shooters and rounds fired and other valuable intelligence.


ShotSpotter uses multiple collaborating sensors and software to geolocate a gunshot. Its algorithms can determine whether the noise was emitted by a firearm, a fireworks display, or a car backfiring. GIF credits: SST, Inc.

“We’ve entered an era where lighting is so much more than illumination,” says Rick Freeman, global general manager for intelligent devices at GE Lighting. “The ecosystem we are building with our Intelligent Environments for Cities solution is transforming street lighting into the analytical brain of urban life, and this MOU with ShotSpotter gives one more option for cities to unlock new potential benefits for their city teams and their residents.”


SST’s acoustic sensors and GE software can give street lamps the ability to detect gunfire in real time, connect to 911 and alert police patrol cars. Image credit: GE Lighting

Only about one in 10 shooting incidents is reported to 911, according to data from SST and the National Gunfire Index. Even when the call does come in, it’s often too late and the information is imprecise.

ShotSpotter sensors are already working in more than 90 cities, including New York City, Washington, D.C., and Sacramento, Calif. San Francisco, for example, reported a 50 percent decrease in recorded firearms violence since deploying ShotSpotter as part of their gun violence abatement strategy, says Ralph A. Clark, president and CEO of SST.

GE’s Intelligent Environments for Cities system gives municipalities the ability to monitor traffic, get severe weather warnings and even spot empty parking places. Image credit: GE Lighting

ShotSpotter uses multiple collaborating sensors and software to geolocate a gunshot. Its algorithms can determine whether the noise was emitted by a firearm, a fireworks display, or a car backfiring. Once validated, the sensor sends a real-time alert to dispatch centers, Public Safety Answer Points and directly to field personnel through any computer or mobile device with Internet access.
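SST doesn’t publish its algorithms, but the core idea, multilateration from time differences of arrival across several sensors, can be sketched in a few lines of Python. Everything here (sensor layout, noiseless timing, the brute-force grid search) is a simplification for illustration, not SST’s implementation:

```python
# Toy 2-D illustration of locating an impulsive sound from arrival-time differences.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

# Four sensor-equipped lamp posts at the corners of a 400 m block (coordinates in meters).
sensors = np.array([[0.0, 0.0], [400.0, 0.0], [0.0, 400.0], [400.0, 400.0]])
true_source = np.array([120.0, 310.0])

# Simulate arrival times; the unknown emission time t0 cancels in the differences.
t0 = 2.0
arrivals = t0 + np.linalg.norm(sensors - true_source, axis=1) / SPEED_OF_SOUND
measured_tdoa = arrivals[1:] - arrivals[0]  # time differences relative to sensor 0

def predicted_tdoa(point):
    """TDOAs (relative to sensor 0) that a source at `point` would produce."""
    times = np.linalg.norm(sensors - point, axis=1) / SPEED_OF_SOUND
    return times[1:] - times[0]

# Brute-force grid search for the location whose predicted TDOAs best match the measurements.
best_point, best_err = None, np.inf
for x in np.arange(0.0, 401.0, 2.0):
    for y in np.arange(0.0, 401.0, 2.0):
        err = np.sum((predicted_tdoa(np.array([x, y])) - measured_tdoa) ** 2)
        if err < best_err:
            best_point, best_err = (x, y), err

print("estimated source:", best_point)   # lands on (or very near) (120.0, 310.0)
```

Real systems add noise handling, sensor calibration and the firearm-versus-fireworks classification described above.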

“ShotSpotter is a proven tool in helping cities across the country address gun violence issues,” says SST’s Clark. “This partnership with GE will accelerate the adoption of this technology in other cities by integrating our solution into existing infrastructure in a more comprehensive way.”

Joseph Savirimuthu: Shifting Winds for Transatlantic Data Flows


A ruling in Europe has sent shockwaves across the Atlantic, shifting the balance between data privacy and information sharing. Here are three key takeaways.

 

A crucial data protection agreement between the European Union and the US has been declared invalid in a move that could spell trouble for US-based internet giants such as Facebook, Google or Microsoft with huge business operations within the EU, and thousands of other US firms working in Europe.

The European Court of Justice (ECJ) has ruled invalid the Safe Harbour agreement, under which US firms transferring data between the US and EU member states were considered to offer sufficient safeguards to comply with European data protection law.

The court’s ruling follows what ECJ Advocate General Yves Bot, the European Commission and the European Parliament have previously acknowledged, and which has been emphasised by the US state surveillance revealed in documents leaked by Edward Snowden: safeguards on personal data in the US are lacking. In its ruling the ECJ stated that: “The United States … scheme enables interference, by United States public authorities, with the fundamental rights of persons”.

The case originates in a complaint brought to the Irish Data Protection Commissioner by the Austrian citizen Max Schrems against Facebook (based in Ireland). Schrems argued that Facebook should be prohibited from transferring personal data to the US due to the surveillance activities of the US government. The legal issues raised by the complaint needed to be addressed by the ECJ and, now that it has handed down its ruling, the case between Max Schrems and the Irish Data Protection Commissioner will come before the High Court.

 

How safe is Safe Harbour?

 

The EU Data Protection Directive prohibits the transfer of personal data to countries outside the EU that do not have an adequate level of data protection regulations in place. Until this ECJ ruling, some held the view that national data protection authorities (such as the Irish Data Protection Commissioner in this case) did not have the power to determine whether data protection in the US was adequate. In fact the Safe Harbour decision included a framework to enable US companies to self-certify compliance with EU data protection rules.

Of course, this case and others, and the Snowden revelations, highlight how inadequate compliance is. European Commission Vice President Viviane Reding noted in January 2014:

We kicked the tyres and saw that repairs are needed. For Safe Harbour to be fully roadworthy the US will have to service it. This summer, we will see how well those repairs were carried out. Safe Harbour has to be strengthened or it will be suspended.

In March 2014, the European Parliament called for a suspension of the Safe Harbour Agreement, while the Center for Digital Democracy filed a complaint with the US Federal Trade Commission (FTC) highlighting the widespread non-compliance of 30 US companies.

The ECJ ruling is significant in two respects. It reinforces that the right to privacy guaranteed by Articles 7 and 8 of the European Charter of Fundamental Rights should not be infringed when data is processed outside the EU. Privacy policies must state that users are effectively protected against the risk of abuse and or unlawful access of that data. It also implies that anyone who is concerned their personal data has been misused should have a legal or judicial remedy.

 

But Business Goes On

 

The ECJ ruling, which is effective immediately, carries three implications.

First, if the Safe Harbour agreement is invalid then data protection authorities can pursue US firms over privacy or data protection complaints. In fact the ECJ ruling essentially requires all member states to undertake an assessment of the adequacy of protections in the case of cross-border data transfers.

Second, cross-border data transfers, while not prohibited, will no longer be a tick-box exercise for self-compliance. It is anticipated that the Article 29 Working Party, which looks into personal data protection issues, may more closely examine US companies’ compliance with the latest guidance notes.

Third, looking to the future there is no doubt that the EU and US will need to review the current self-certification and self-regulation mechanisms, which will have to demonstrate that they’re transparent, proportionate and fair.

The ECJ ruling provides a clear reminder of the far-reaching significance of human rights legislation in European data protection law, and of the role of the courts in upholding those rights. It won’t go down well in the US, where there is a view that it ignores the considerable reforms to how US intelligence agencies conduct their data gathering activities. The Privacy and Civil Liberties Oversight Board, in its reassessment of US intelligence law, identified measures designed to deal with indiscriminate data gathering.

For now, it may appear that this is the end of business as usual under the Safe Harbour agreement, but in truth a single legal answer will not resolve the complexities of ensuring that data protection to European standards can be maintained in other countries. It is a day for upholding fundamental human rights principles, but the hard practicalities lie ahead.

(Top image: Courtesy of Thinkstock)

This piece first appeared in The Conversation.

 

Joseph Savirimuthu is Senior Lecturer in Law and Director of Postgraduate Studies at the University of Liverpool.

 

The Odd Couple: Silicon and Carbon Don’t Love Each Other. But When They Iron Out Differences, Their Marriage Can Be Revolutionary


Silicon and carbon are reluctant partners. Although the two elements are among the most abundant on Earth, they almost never bond in nature and it takes a lot of heat and pressure in the lab to coax them into working with each other. But when they do stick together and form a material called silicon carbide (SiC), it’s something to see.

“SiC is a key building block for next-generation devices,” says Danielle Merfeld, global technology director at GE Global Research. “It takes features from diamond, one of the toughest materials in the world, and combines them with features of silicon, our ubiquitous semiconductor technology in electronics, and takes the best of those to make a very new kind of material for power electronics. SiC can more efficiently handle higher voltages and three times the amount of energy compared to silicon chips. Suddenly, you can run everything from locomotives to planes and wind farms faster, hotter and more efficiently.”

For example, retrofitting a datacenter with SiC chips would make it 5 percent more efficient. The technology could allow aircraft makers to shed 1,000 pounds from a passenger jet, reduce the weight of a locomotive by 5 percent, cut power losses inside wind and solar inverters by half, and make hybrid electric vehicles consume 10 percent less fuel.

GE – together with the SUNY Polytechnic Institute’s Colleges of Nanoscale Science and Engineering, New York State and other industry partners – is now building a new SiC foundry in Utica, N.Y. The company, which has been working on SiC for almost 25 years, also contributed intellectual property valued at $100 million to get the project on its way.

 

GE Reports recently visited Merfeld’s GRC lab. Take a look.

GE started working with SiC in 1991. The first product was a hardy UV photodiode that could withstand the infernal heat inside a gas turbine and monitor the flame. The latest products include power electronics that can handle the hot and harsh environment inside oil wells, aboard planes and ships, and under the hood of hybrid electric cars.

Manufacturing a silicon carbide chip requires over 200 steps performed in a clean room, and companies have to negotiate pitfalls opened by the complicated interactions between silicon, carbon and metal oxides. The images above and in the top GIF show one of the first fabrication steps: using spin coating to deposit a thin layer of photoresist that allows workers to pattern the wafer.
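
As a rough illustration of why the spin-coating step is so sensitive, resist thickness is commonly modeled as falling off with spin speed as a power law. The constant and exponent in the sketch below are assumed values for illustration, not GE process parameters:

```python
# Rough rule-of-thumb sketch: photoresist thickness falls as spin speed rises,
# approximately as a power law h ~ k * rpm**(-a). The constant k and exponent a
# are assumed for illustration only; real values depend on the resist's
# viscosity and solvent evaporation.
def resist_thickness_um(rpm, k=60.0, a=0.5):
    """Approximate photoresist thickness in micrometers at a given spin speed."""
    return k * rpm ** (-a)

for rpm in (1000, 2000, 4000):
    print(f"{rpm} rpm -> ~{resist_thickness_um(rpm):.2f} um")
```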

The very first SiC application, a century ago, was as an abrasive. Today, the material’s hardy constitution gives SiC chips excellent reliability and potential lifespans of 100 years. This comes in handy in deep-sea oil wells or offshore wind turbines, which must perform without a hitch for long periods of time.

 

The New York foundry will manufacture SiC chips on 6-inch wafers, which have more than double the area of the standard 4-inch wafers. The larger size will help cut costs and scale production. The GIFs above show microscopes inspecting chips on an SiC wafer.
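
The “more than double” figure follows directly from geometry: wafer area scales with the square of the diameter. A quick check using nominal 100 mm and 150 mm diameters for 4-inch and 6-inch wafers:

```python
import math

# Wafer area scales with the square of the diameter, so moving from a nominal
# 100 mm (4-inch) wafer to a 150 mm (6-inch) wafer more than doubles the
# usable area - and, roughly, the number of chips per wafer.
area_4in = math.pi * (100 / 2) ** 2   # mm^2
area_6in = math.pi * (150 / 2) ** 2   # mm^2
print(f"4-inch wafer area: {area_4in:,.0f} mm^2")
print(f"6-inch wafer area: {area_6in:,.0f} mm^2")
print(f"Area ratio: {area_6in / area_4in:.2f}x")   # 2.25x
```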

GE makes SiC chips called MOSFETs – metal-oxide semiconductor field effect transistors. They help manage power inside machines and can handle temperatures as high as 200 degrees Celsius (392 Fahrenheit), where ordinary silicon would fail. Above is a partially processed 6-inch wafer in the early stages of patterning.

An SiC wafer like this one can hold hundreds of chips. They can handle over 1,000 volts and up to 100 amps. The picture above shows a finished wafer with individual MOSFET chips.

A close-up view of a power module – in teal – showing three SiC MOSFETs with their power and signaling connections. The module is a key building block for power electronics systems. It processes raw electrical power into clean sine waves that customers can use.
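
How a switching module turns raw electrical power into a clean sine wave is easiest to see with sinusoidal pulse-width modulation, the textbook inverter technique: a sine reference is compared against a fast triangular carrier, and the resulting on/off pattern, once filtered, averages out to the desired sine wave. The sketch below is purely illustrative and is not GE’s control scheme:

```python
import math

# Illustrative sinusoidal PWM: the switch turns ON whenever a 60 Hz sine
# reference exceeds a high-frequency triangular carrier. Filtering the switched
# output recovers a clean sine wave. Textbook sketch only, not GE's design.
F_REF, F_CARRIER, SAMPLES = 60.0, 3000.0, 2000

def triangle(t, f):
    """Triangular carrier in [-1, 1] at frequency f."""
    phase = (t * f) % 1.0
    return 4 * phase - 1 if phase < 0.5 else 3 - 4 * phase

on_samples = 0
for n in range(SAMPLES):
    t = n / (SAMPLES * F_REF)                      # spans one reference period
    reference = math.sin(2 * math.pi * F_REF * t)
    on_samples += reference > triangle(t, F_CARRIER)

print(f"Switch is on for {100 * on_samples / SAMPLES:.1f}% of the cycle")  # ~50%
```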

GE engineers are working with two flavors of SiC power modules: the teal module with its silver wire-bond connections, and the yellow module above, which has no wire bonds. “The SiC MOSFET is a high-performance device,” says Ljubisa Stevanovic, advanced technology leader at GE Global Research. “In an automotive analogy, it’s a racecar engine that’s much faster than conventional devices. The teal module shows the sturdy, conventional design that’s built like a pickup truck. But the yellow module above has no wire bonds. It’s like a Formula 1 chassis. It really allows us to go fast.”

“In applications requiring high speed we use SiC modules without wire bonds, which deliver amazing performance,” Stevanovic says. The golden rectangles are transistors and the silver squares are diodes.

This SiC Power Block sports several SiC modules, including electronic controls, energy storage and cooling components. Engineers can use the device to manage power for a wide range of applications. “The SiC Power Block delivers maximum power in the smallest, most efficient package and this self-contained and fully optimized unit can manage power inside wind and solar farms, locomotives, datacenters and many other applications,” Stevanovic says. “This is the perfect example of the GE Store. The standard power block can be applied for different systems across GE. It makes the application engineer’s job easier. They don’t have to reinvent the wheel every time they want to use SiC.”

Sink and Swim: Stinger the Swimming Robot Keeps Nuclear Reactors Healthy


Nothing says summertime in Georgia like a dip in the old swimming hole. But near the town of Baxley, there’s one pool that’s not open to the public: the crystal-clear blue waters of the containment vessel bathing the Edwin Irby Hatch Power Plant’s nuclear reactor.

Although this is no place for a swim, the vessel must be monitored. In the past, during scheduled maintenance and refueling downtime, multiple inspectors would clamber onto platforms extending above the pool and plunge cameras mounted on poles or tied to ropes into the vessel. Using such handheld devices allowed them to get a close-up view of the welds in the reactor pressure vessel and of other surfaces that had to be kept in perfect order. It worked, but it was a slow process and the inspectors had to protect themselves from radiation coming from the fuel below.

But now the team has a new member that’s happy to jump in. He’s called Stinger, the swimming robot, and he allows nuclear plant personnel to go places no human could reach before.


Stinger, which is a bit taller than an average human, is a steerable unmanned underwater vehicle. He comes equipped with multidirectional, computer-controlled thrusters and a high-resolution color video camera. “Where Stinger’s camera and tools need to operate, a human could not survive in that location,” says Brandon Smith, the GE Hitachi Nuclear Energy mechanical engineer who led development of the machine. “That’s exactly why we need robotics to do this kind of work. There’s just no other way to do it and Stinger is built specifically to operate in that type of environment.”

The first-of-its-kind remote-operated vehicle is now being deployed to nuclear power plants across the U.S. as they go through scheduled refueling and inspection outages. It dives in the reactor pool for up to three weeks, using cameras to get a good look at material degradation. In addition to its camera technology, it also carries a high-pressure water nozzle, or hydrolaser, to clean metallic surfaces to ensure a good, clean look at the welds.


 

Smith and his team gave the robot a tungsten frock for protection and electronics that are less susceptible to high radiation levels. A single technician can operate it remotely from a tent hundreds of feet away from the vessel. The robot is connected through a power and control umbilical to the on-site operator and off-site, certified inspectors.

“The technician uses Stinger’s cameras to look for signs of degradation,” says Smith. “It’s really good at getting into constricted spaces and around tight corners to look for any sign that a component might fail—to catch something before it becomes a problem.”


Stinger is now successfully performing inspections throughout the U.S. nuclear industry. Smith says customers like it because it can work longer with fewer concerns about radiation exposure. It can also perform its job while other outage operations are going on since it doesn’t need inspectors to be suspended above the reactor pool. These benefits translate to shorter plant downtime and lower safety risks to employees.

“Our customers make money when their plants are producing power, so they’re always trying to reduce the length of necessary downtime,” says Smith. “They’re also always looking to reduce radiation exposure to workers. By moving Stinger in and workers away to a lower dose area, they are able to accomplish both goals.”

Image credits: GE Hitachi Nuclear Energy

Kurt’s Cradle: Kurt Vonnegut was GE’s PR Man Before Becoming a Bestselling Author


Before Kurt Vonnegut Jr. wrote the bestsellers Slaughterhouse-Five and Cat’s Cradle, he lived near Schenectady, New York, and worked as a GE publicist. According to Vonnegut’s biographer Charles J. Shields, he was hired in 1947 as part of GE’s drive “to get some real journalists on board to hunt for stories at the Schenectady Works and keep a steady drumbeat of good news issuing from the plant.” Writes Shields:

“Thus it happened that Kurt received a call in late August from George W. Griffin Jr., a General Electric public relations executive. Griffin explained that [Vonnegut’s brother] Bernard had recommended his younger brother as the kind of man they might be looking for: someone with a science background who was also a reporter. Would he be interested in interviewing for a job in Schenectady?”

Vonnegut told the scholar Robert K. Musil that “the job required my visiting the scientists and talking to them and asking them what they were up to. Every so often a good story would come out of it.”

Vonnegut said that the plant in Schenectady inspired him to write his first novel, Player Piano. Shields writes that after Vonnegut submitted the first draft, Kurt asked his editor to refrain from touting the book, which deals with a dystopian world run by machines, “as a satire of one of the world’s largest corporations,” or Bernard’s career “might suffer through guilt by association.”


Kurt Vonnegut, third from left, is taking notes during a VIP tour of GE’s Schenectady plant. The photo was preserved by Mary Robinson. Her father, Ollie M. Lyon Jr., was Vonnegut’s close friend and colleague in the GE PR department. He is the man in the center in the three-piece suit. Image credit: Mary Robinson

It did not. In fact, Vonnegut based a lead character in the novel Cat’s Cradle on Dr. Irving Langmuir, Bernard’s boss at GE and the first scientist in private industry to win a Nobel Prize. Langmuir and Bernard Vonnegut worked on techniques designed to control the weather and the team succeeded in starting artificial rainfall and snowfall.

Cat’s Cradle’s central plot device is a form of frozen water called Ice-9, which stays solid even at warm temperatures. Vonnegut told George Plimpton, who interviewed him for the Paris Review, that the idea for the substance came from Langmuir. He said that when the writer H.G. Wells visited Schenectady, “Langmuir thought he might entertain Wells with an idea for a science-fiction story – about a form of ice that was stable at room temperature. Wells was uninterested, or at least never used the idea. And then Wells died, and then, finally, Langmuir died. I thought to myself: ‘Finders, keepers – the idea is mine.’”

Vonnegut, who stayed at GE until 1950, died in 2007.


GE’s Third-Quarter Results Highlight Pace of Change, Focus on Industrial Core


GE’s third-quarter results, released today, put in focus the rapid change taking place at the company as it races to sell most of its financial businesses and embrace its industrial core and software. GE’s transformation into a digital industrial powerhouse has gained speed with the accelerated sales of GE Capital assets and regulatory approvals for the acquisition of Alstom’s power and grid businesses – GE’s largest industrial acquisition ever.

GE beat analyst estimates, reporting $0.29 per share in operating earnings. Industrial revenues on an organic basis (without the effects of currencies or M&A) rose 4 percent. The company also highlighted progress on its cost initiatives, growing margins by 100 basis points and gross margins by 80 basis points.
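
For readers unfamiliar with the metric, “organic” growth simply strips out revenue swings caused by currency translation and by buying or selling businesses. A sketch with invented numbers (not GE’s actual figures):

```python
# Illustration only - the dollar figures below are invented, not GE's.
# "Organic" growth removes the effects of currency translation and of
# acquisitions/divestitures to isolate growth of the underlying business.
def organic_growth(current, prior, fx_impact, m_and_a_impact):
    """Growth rate after stripping FX and M&A effects out of current revenue."""
    adjusted_current = current - fx_impact - m_and_a_impact
    return (adjusted_current - prior) / prior

# Hypothetical quarter: $27.0B reported vs. $26.5B a year earlier, with a
# $1.2B currency drag and $0.6B added by acquisitions.
rate = organic_growth(current=27.0, prior=26.5, fx_impact=-1.2, m_and_a_impact=0.6)
print(f"Organic revenue growth: {rate:.1%}")   # ~4.2% on these made-up inputs
```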


GE has been transforming itself into the world’s largest digital industrial company and connecting machines to the Industrial Internet. Image credit: GE Reports

GE also announced today that it was launching the Synchrony Financial share exchange next week, a major milestone in its journey to separate from Synchrony, the largest provider of private label credit cards in the United States.

The Synchrony news comes on the heels of a GE Capital announcement this week to sell several of its commercial lending businesses for approximately $30 billion to Wells Fargo. This is the largest single sale GE has made since it disclosed its plans to exit most of GE Capital on April 10th. Since the beginning of the year, GE has announced the sale of more than $126 billion of GE Capital assets.


“GE is executing and is on track to deliver on its 2015 goals,” said GE Chairman and CEO Jeff Immelt. “Our portfolio transformation is happening at an unprecedented pace. We have a focused infrastructure business with leading capabilities in our markets.”

Adding to those capabilities will be Alstom’s power and grid businesses, which have been cleared for acquisition by regulators in Europe. GE said today it expects that deal to close “within weeks.” The acquisition, GE’s largest-ever industrial deal, will grow its power generation installed base by 50 percent.

Additionally, GE reported today $27.9 billion in revenues from its industrial businesses and verticals. The company also reported $6.5 billion in cash from operating activities on a year-to-date basis. Within the industrial division, 5 out of 7 segments reported growth in earnings. The Company currently has a $199 billion backlog of services, positioning it well for future business cycles.

GE has been also building up its services business and expanding its software offerings for the Industrial Internet. At the GE Minds + Machines annual event last month, the company said that software and solutions were set to deliver $15 billion in revenue by 2020. Said Immelt: “We are transforming GE into the world’s premier Digital Industrial company, in a unique position to drive outcomes for customers and grow margins.”

The Road to ecoROTR: How Building a Better Wind Turbine Began With an Online Shopping Spree for Styrofoam Balls


Scientists at GE Global Research spent the last four years building a more efficient wind turbine. It rises 450 feet above the Mojave Desert in California – almost half the height of the Eiffel Tower – and seems to have a silver UFO stuck to its face. The turbine may look strange, but you are looking at the future of wind power. Here’s how it came about.

In 2011, Mark Little, GE’s chief technology officer and the head of GE Global Research (GRC) at the time, challenged principal engineer Seyed Saddoughi and his team to build a rotor that could harvest more wind. Michael Idelchik, who runs advanced technology programs at the GRC, gave them another clue: “Since we know that the inner parts of wind turbines don’t do much for energy capture, why don’t we change the design?”


The team came up with the idea of putting a hemisphere on the center part of the wind turbine to redirect the incoming wind towards the outer parts of the blades. “The biggest unknown for us was what size the dome should be,” Saddoughi says. The group decided to do some experiments. They bought a 10-inch wind turbine and a bunch of Styrofoam balls of different sizes online, then took the lot to a wind tunnel at GE’s aerodynamics lab (see above). “By cutting the Styrofoam balls in half, we created our domes of different sizes and then stuck these domes on the center of the small wind turbine and ran our experiments at different tunnel air speeds,” Saddoughi says. All image credits: GE Global Research


The team hooked up the turbine to their instruments and measured the amount of voltage it produced. “Invariably we got a jump in voltage output with the dome placed at the center of the wind turbine; albeit the increases differed for different size domes,” Saddoughi says. The scientists reached out to a colleague who did simple computer simulations for them and confirmed that even a full-size turbine was more efficient with a nose upfront. “Of course we were overjoyed by the very limited experimental and computational results,” Saddoughi says. “We wanted to come up with a name for this design, such that it really represented the idea – and was also something that everybody would remember easily. The team gathered in my office again, and after an hour of playing with words the name Energy Capture Optimization by Revolutionary Onboard Turbine Reshape (ecoROTR) was created.”
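
The analysis behind that jump in voltage is simple: average the readings for each dome size and compare them with the bare-rotor baseline. Here is a sketch with invented readings, since the team’s actual measurements aren’t published here:

```python
from statistics import mean

# Invented voltage readings for illustration only - the article does not
# publish the team's data. Each dome size maps to samples taken at one tunnel
# speed; the gain is measured against the bare rotor ("no dome") baseline.
readings = {
    "no dome":     [2.10, 2.08, 2.12],
    "small dome":  [2.16, 2.18, 2.15],
    "medium dome": [2.24, 2.26, 2.23],
    "large dome":  [2.19, 2.21, 2.20],
}

baseline = mean(readings["no dome"])
for dome, volts in readings.items():
    gain = (mean(volts) - baseline) / baseline
    print(f"{dome:12s} mean {mean(volts):.2f} V   gain {gain:+.1%}")
```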


The team then built a 2-meter rotor model of the turbine and took it for testing to a large wind tunnel in Stuttgart, Germany.


The tunnel was 6.3 meters in diameter, which allowed the team to dramatically reduce the wall effects on the rotor’s performance.
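
The reason a 6.3-meter tunnel matters for a 2-meter rotor is blockage: the smaller the rotor’s swept area relative to the tunnel cross-section, the less the walls distort the flow around the model. A quick ratio check:

```python
import math

# Blockage ratio: rotor swept area as a fraction of the tunnel cross-section.
# Around 10% here - small enough that wall interference stays manageable,
# though the acceptable threshold depends on the test and correction methods.
rotor_area = math.pi * (2.0 / 2) ** 2    # 2 m rotor diameter
tunnel_area = math.pi * (6.3 / 2) ** 2   # 6.3 m tunnel diameter
print(f"Blockage ratio: {rotor_area / tunnel_area:.1%}")   # ~10%
```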


The researchers spent a couple of months working in Stuttgart. “We conducted a significant number of experiments at the Gust wind tunnel for different tunnel air velocities and wind turbine tip-speed ratios with several variations of domes,” Saddoughi says.


“The wind tunnel was also operated at its maximum speed for the blades in feathered configurations at several yaw angles of the turbine to simulate gust conditions,” Saddoughi says. They ran the turbine as fast as 1,000 rpm and carried out surface dye flow visualization experiments (see below).
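
Tip-speed ratio – one of the variables the team swept – is just the blade tip speed divided by the incoming wind speed. For the 2-meter model at the 1,000 rpm mentioned above, and an assumed tunnel speed used purely for illustration:

```python
import math

# Tip-speed ratio (lambda) = blade tip speed / incoming wind speed.
# The rotor size and rpm come from the article; the tunnel speed is an
# assumed value for illustration only.
radius_m = 1.0        # 2-meter model rotor
rpm = 1000
tunnel_speed = 12.0   # m/s, assumed

tip_speed = 2 * math.pi * radius_m * rpm / 60    # ~105 m/s
print(f"Tip speed: {tip_speed:.0f} m/s, tip-speed ratio: {tip_speed / tunnel_speed:.1f}")
```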


When dye hits the fan: Saddoughi after the dye flow visualization. When the team came back in the second half of 2012, they started designing the actual prototype of the dome, which was 20 meters in diameter and weighed 20 tons. The size presented a new batch of challenges. “Unlike gas or steam turbines that are designed to operate under a relatively limited number of set conditions, wind turbines must operate reliably and safely under literally hundreds of conditions, many of them highly transient,” says Norman Turnquist, senior principal engineer for aero thermal and mechanical systems.


They ran more calculations to make sure that GE’s 1.7-megawatt test turbine in Tehachapi, Calif., would be able to support the dome. They looked at performance during different wind speeds and directions, storms and gusts. They also designed special mounting adapters and brackets to attach the dome. “The design looked really strange, but it made a lot of sense,” says Mike Bowman, the leader of sustainable energy projects at GE Global Research. The team then assembled the dome on site. “Early on, it was decided that the prototype dome would be a geodesic construction,” Turnquist says. “The reason is simply that it was the construction method that required the least amount of unknown risk.”
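
To get a feel for the loads those calculations had to cover, a back-of-the-envelope drag estimate on a 20-meter dome is instructive. The wind speed and drag coefficient below are assumed for illustration; this is not GE’s load analysis:

```python
import math

# Back-of-the-envelope drag load on the 20 m dome: F = 0.5 * rho * v^2 * Cd * A.
# Wind speed and drag coefficient are assumed for illustration only.
rho = 1.225                      # air density, kg/m^3
v = 25.0                         # assumed wind speed, m/s
cd = 0.4                         # assumed drag coefficient for a rounded dome
area = math.pi * (20.0 / 2)**2   # frontal area of a 20 m diameter dome, m^2

force_kn = 0.5 * rho * v**2 * cd * area / 1000
print(f"Approximate drag load: {force_kn:.0f} kN")   # ~48 kN, roughly 5 tonnes of force
```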


For safety reasons, the workers assembled the dome about 300 meters from the turbine and used a giant crane to move it to the turbine base for installation. But there was a hitch. “After the adapters were mounted to the hub, it was discovered that the bolt circle diameter was approximately 8mm too small to fit the dome,” Turnquist says. The team had to make custom shims to make it work.


The dome went up in May, on Memorial Day, and the turbine is currently powering through four months of testing. “This is the pinnacle of wind power,” says Mike Bowman. “As far as I know, there’s nothing like this in the world. This could be a game changer.”


The First American Jet Engine Was Born Inside a Power Plant: A GE Store Story


For most people, Thomas Edison is the man who came up with the first practical light bulb. But Edison was also an inveterate entrepreneur who parlayed his patents into new industries and enduring businesses. Take GE, the result of an 1892 merger between his Edison General Electric Co. and Thomson-Houston Electric Co. It has since grown into an industrial giant with $148 billion in annual revenues, making everything from MRI scanners to gas turbines and jet engines.

Although these businesses may seem very different, they often trace their origins to his lab and to a point in history where Edison, light and electricity intersect. The light bulb led him into X-rays and the medical imaging business, and GE’s expertise in power generation and gas turbine engineering gave birth to the company’s aviation business. (This sharing is a two-way street. Aviation engineers are now helping their colleagues in power generation build more efficient gas turbines with their jet engine know-how.)

It’s in part because of these synergies – GE executives call this cross-pollination “the GE store” – that GE Power & Water and GE Aviation alone produced a combined $50 billion in revenues in 2014, more than a third of the company’s total. Take a look at their intertwined history.


Edison’s light bulb and the wave of electric devices that followed created a huge demand for electricity. Initially, companies were using piston engines to power generators, but they quickly switched to more efficient steam turbines. In 1903, GE engineers Charles Curtis and William Emmet built what was then the world’s most powerful steam turbine generator for a power plant in Newport, R.I. (see above). It required one-tenth the space and cost two-thirds less than the equivalent piston engine generator. Top image: A GEnx engine for the Dreamliner. GE Aviation has its roots inside a power plant. Image credit: Adam Senatori/GE Reports

 

 


It was also in 1903 that GE hired young turbine engineer Sanford Moss (above). Moss had just received a doctorate in gas turbine research from Cornell University. At GE, he started building a revolutionary radial gas compressor that used centrifugal force to squeeze the air before it enters the gas turbine – the same force pushing riders up into the air on a swing carousel. Moss’s early experiments failed; his machine guzzled too much fuel and produced too little power. But his patent and his revolutionary compressor design were sound and found many applications: from supplying air to blast furnaces to powering pneumatic tube systems. He didn’t know it, but he had pointed the way to the jet engine before the Wright Brothers even took off.
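
The physics Moss was chasing can be sketched with the standard turbomachinery relation: the work a centrifugal impeller puts into the air scales with the square of its tip speed, and that work sets the achievable pressure ratio. The numbers below are generic assumptions for illustration, not Moss’s design values:

```python
# Generic sketch of why impeller tip speed matters in a centrifugal compressor:
# Euler's relation gives specific work ~ slip_factor * tip_speed**2 for radial
# blades, and the pressure ratio follows from that work. All values are assumed.
CP, GAMMA, T_INLET = 1005.0, 1.4, 288.0   # J/(kg*K), ratio of specific heats, K

def pressure_ratio(tip_speed, slip=0.9, efficiency=0.75):
    work = slip * tip_speed ** 2                 # specific work input, J/kg
    ideal_temp_rise = efficiency * work / CP     # isentropic-equivalent rise, K
    return (1 + ideal_temp_rise / T_INLET) ** (GAMMA / (GAMMA - 1))

for u in (250, 350, 450):                        # impeller tip speeds, m/s
    print(f"tip speed {u} m/s -> pressure ratio ~{pressure_ratio(u):.2f}")
```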

 


In November 1917 – at the peak of World War I – GE President E.W. Rice received a note from the National Advisory Committee for Aeronautics, the predecessor of NASA, asking about Moss’s radial compressor. WWI was the first conflict to involve planes, and the agency wanted Moss to improve the performance of the Liberty aircraft engine. The engine was rated at 354 horsepower at sea level, but its output dropped by half in the thin air at high altitudes. Moss (at right in the picture above) believed that he could use his compressor to squeeze the air before it entered the engine, making it denser and recovering the engine’s lost power.


Using a mechanical device to fill the cylinders of a piston engine with more air than it would typically ingest is called supercharging. Moss designed a turbosupercharger that used the hot exhaust coming from the Liberty engine to spin a turbine driving his radial compressor, which squeezed the air entering the engine. In 1918, he tested the design at 14,000 feet on top of Pikes Peak, Colo. The engine delivered 352 horsepower, essentially its rated sea-level output, and GE entered the aviation business.


The first Le Pere biplane powered by a turbosupercharged Liberty engine took off on July 12, 1919. “The General Electric superchargers thus far constructed have been designed to give sea-level absolute pressure at an altitude of 18,000 feet, which involves a compressor that doubles the absolute pressure of the air,” Moss wrote.
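
Moss’s “doubles the absolute pressure” target falls straight out of the standard atmosphere: ambient pressure at 18,000 feet is roughly half its sea-level value, so a pressure ratio of about two restores sea-level intake pressure. A quick check using the textbook barometric formula for the troposphere:

```python
# Standard-atmosphere check of Moss's numbers. A supercharger must supply a
# pressure ratio of roughly sea_level_pressure / ambient_pressure to restore
# sea-level intake conditions at altitude.
P0 = 101_325.0   # sea-level pressure, Pa

def pressure_at(feet):
    """Ambient pressure (Pa) from the standard troposphere model."""
    meters = feet * 0.3048
    return P0 * (1 - 2.25577e-5 * meters) ** 5.25588

for alt in (14_000, 18_000):   # the Pikes Peak test and the design target
    p = pressure_at(alt)
    print(f"{alt} ft: {p / 1000:.1f} kPa, required pressure ratio ~{P0 / p:.2f}")
```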


Planes equipped with Moss’s turbosupercharger set several world altitude records.

 


In 1937, on the eve of World War II, GE received a large order from the Army Air Corps to build turbosuperchargers for Boeing B-17 and Consolidated B-24 bombers, P-38 fighter planes, Republic P-47 Thunderbolts, and other planes. GE opened a dedicated Supercharger Department in Lynn, Mass. In 1939, Moss proposed building one of the first turboprop engines. Trained as a gas turbine engineer, he was later inducted into the National Aviation Hall of Fame.


But GE’s aviation business was just getting started. In 1941, the U.S. government asked GE to bring to production one of the first jet engines, developed in England by Sir Frank Whittle. (He was knighted for his feat.) A group of GE engineers called the Hush-Hush Boys designed new parts for the engine, redesigned others, tested it and delivered a top-secret working prototype called I-A. On October 1, 1942, the first American jet plane, the Bell XP-59A, took off from Lake Muroc in California for a short flight. The jet age in the U.S. had begun. Demand for the first production jet engines, the J33 and J35, was so high that GE had a hard time meeting production numbers, and the Army outsourced manufacturing to General Motors’ Allison division.


GE decided to double down and invest in more jet engine research. The J33 used a radial – also called centrifugal – compressor to squeeze the air, similar to the design that Moss developed for his turbosuperchargers. But GE engineers also developed engines with axial compressors, which push air through the engine along its axis – the design most jet engines use today. The result was the J47 jet engine, which powered everything from fighter jets like the F-86 Sabre to the giant Convair B-36 strategic bombers. GE made 35,000 J47 engines, making it the most-produced jet engine in history.


The J47 also found several off-label applications. The Spirit of America jet car used one, and a pair of them powered what is still the world’s fastest jet-propelled train. They also served on the railroad as heavy-duty snow blowers. In 1948, GE hired German war refugee and aviation pioneer Gerhard Neumann, who quickly went to work on improving the jet engine. He came up with a revolutionary innovation called the variable stator. It allowed pilots to change the pressure inside the turbine and make planes routinely fly faster than the speed of sound. When GE started testing the first jet engine with Neumann’s variable stator, the J79 (see below), engineers thought that their instruments were malfunctioning because of the amount of power it produced. In the 1960s, a GE-powered XB-70 Valkyrie aircraft was flying in excess of Mach 3, three times the speed of sound.

 


The improved performance made the aviation engineers realize that their variable vanes and other design innovations could also make power plants more efficient. Converting the engines for land use wasn’t difficult. In 1959, they turned a T58 helicopter engine into a turbine that produced 1,000 horsepower and could be used for generating electricity on land and on boats. A similar machine built around the J79 jet generated 15,000 horsepower. In Cincinnati, where GE Aviation moved from Lynn in the 1950s, the local utility built a ring of 10 J79 jet engines to power a big electricity generator.

 


The first major application of such turbines, which GE calls “aeroderivatives” because of their aviation heritage, was as power plants for the Navy’s Spruance-class destroyers. The turbines now also power the world’s fastest passenger ship, the Francisco, which can carry 1,000 passengers and 150 cars and travel at 58 knots.


Today, there are thousands of aeroderivatives working all over the world. Most recently, they have been helping Egypt’s growing economy slake its thirst for electricity.


Neumann’s variable vanes (above) are also part of GE’s most advanced gas turbine: the 9HA Harriet, the world’s largest, most powerful and most efficient gas turbine. Two of them can generate the same amount of power as a small nuclear power plant.


At the same time, GE Aviation is working on the next-generation jet engine called ADVENT, or Adaptive Versatile Engine Technology (above). “To put it simply, the adaptive cycle engine is a new architecture that takes the best of a commercial engine and combines it with the best of a fighter engine,” says Jed Cox, who leads the ADVENT project for the U.S. Air Force Research Lab.

 


GE Announces Synchrony Financial Exchange Offer


GE (NYSE: GE) announced today its offer to exchange GE company stock for shares in Synchrony Financial (NYSE: SYF), the largest provider of private label credit cards in the United States.*

Shareholders who participate in the exchange offer will receive a certain amount of Synchrony common stock for each share of GE common stock accepted in the exchange offer. The exact ratio will be determined shortly before the exchange offer expires in mid-November. GE shares tendered and accepted for exchange will reduce the outstanding shares of GE. Further details of the exchange are set forth within the Prospectus on the Exchange Offer website.

Synchrony Financial offers private label and co-branded credit cards, promotional financing and installment lending, loyalty programs and FDIC-insured savings products through Synchrony Bank. Synchrony traces its roots back to 1932, and it is the holding company for the entities that historically conducted GE’s North American retail finance business. Last July, Synchrony launched and successfully executed the third-largest initial public offering of 2014. Last week, Synchrony received approval from the Federal Reserve Board to become a standalone savings and loan holding company following completion of the exchange offer.

The Synchrony Financial separation is consistent with GE’s strategy of reducing the size of its financial businesses and focusing on its industrial core. GE believes that the separation will allow Synchrony to operate as a standalone company and pursue a long-term strategy that is focused only on its own business objectives.

“The Synchrony exchange is an important part of GE’s transformation into a simpler, more focused company,” said GE Chairman and CEO Jeff Immelt. “We expect the exchange will reduce the outstanding float of GE common stock by 6 percent to 7 percent, and if fully subscribed, would represent the equivalent of about $18 billion – $21 billion in GE stock buyback, subject to the relative performance of GE and Synchrony stock prices. With the launch of today’s exchange offer and progress to-date on the GE Capital Exit Plan, we are on track to return more than $90 billion to investors from 2015 to 2018 with more than 90 percent of our earnings coming from high-return industrial businesses.”
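
For readers wondering how an exchange ratio translates into a reduction in GE’s float, here is a purely hypothetical sketch. Every input below – share prices, discount and share counts – is an invented placeholder, since the actual ratio will only be set shortly before the offer expires:

```python
# Purely hypothetical illustration of a split-off exchange offer. All inputs
# are invented placeholders; the real exchange ratio is set shortly before the
# offer expires, based on market prices at that time.
ge_price = 28.00                          # hypothetical GE share price, $
syf_price = 31.00                         # hypothetical Synchrony share price, $
discount = 0.07                           # hypothetical incentive discount
ge_shares_outstanding = 10_000_000_000    # hypothetical GE share count
value_tendered = 20_000_000_000           # hypothetical $ value of GE shares accepted

# In this hypothetical structure, each tendered GE share buys Synchrony stock
# at a discounted price, and the tendered GE shares are retired.
exchange_ratio = ge_price / (syf_price * (1 - discount))
ge_shares_retired = value_tendered / ge_price
float_reduction = ge_shares_retired / ge_shares_outstanding

print(f"Hypothetical exchange ratio: {exchange_ratio:.3f} SYF shares per GE share")
print(f"GE shares retired: {ge_shares_retired / 1e9:.2f}B ({float_reduction:.1%} of shares outstanding)")
```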

*Source: The Nilson Report (April, 2015, Issue # 1062) – based on 2014 data.

Additional Information and Where to Find It:

This document is for informational purposes only and is neither an offer to sell or the solicitation of an offer to buy any securities nor a recommendation as to whether investors should participate in the exchange offer. Synchrony will file with the Securities and Exchange Commission (the “SEC”) a registration statement on Form S-4 that will include a Prospectus and GE will file with the SEC a Schedule TO, which will more fully describe the terms and conditions of the exchange offer.  The exchange offer will be made solely by the Prospectus. The Prospectus will contain important information about the exchange offer, GE, Synchrony and related matters, and GE will deliver the Prospectus to holders of GE common stock. INVESTORS AND SECURITY HOLDERS ARE URGED TO READ THE PROSPECTUS, AND ANY OTHER RELEVANT DOCUMENTS FILED WITH THE SEC, WHEN THEY BECOME AVAILABLE AND BEFORE MAKING ANY INVESTMENT DECISION, BECAUSE THEY CONTAIN IMPORTANT INFORMATION. None of GE, Synchrony or any of their respective directors or officers or the dealer managers appointed with respect to the exchange offer makes any recommendation as to whether you should participate in the exchange offer.

Holders of GE common stock may obtain copies of the Prospectus, other related documents, and any other information that GE and Synchrony file electronically with the SEC free of charge at the SEC’s website at http://www.sec.gov. Holders of GE common stock will also be able to obtain a copy of the Prospectus by clicking on the appropriate link on this website. Alternatively, Georgeson Inc., the information agent for the exchange offer, will, upon request, arrange to send the Prospectus to holders of GE common stock who call (866) 300-8594 (toll-free in the United States) or (781) 575-2173 (international).

Forward-Looking Statements

This document contains “forward-looking statements” – that is, statements related to future, not past, events.  In this context, forward-looking statements often address our expected future business and financial performance and financial condition, and often contain words such as “expect,” “anticipate,” “intend,” “plan,” “believe,” “seek,” “see,” “will,” “would,” or “target.”

Forward-looking statements by their nature address matters that are, to different degrees, uncertain, such as statements about our announced plan to reduce the size of our financial services businesses, including expected cash and non-cash charges associated with this plan; expected income; earnings per share; revenues; organic growth; margins; cost structure; restructuring charges; cash flows; return on capital; capital expenditures, capital allocation or capital structure; dividends; and the split between Industrial and GE Capital earnings.

For us, particular uncertainties that could cause our actual results to be materially different than those expressed in our forward-looking statements include:

  • failure to consummate the exchange offer;
  • obtaining (or the timing of obtaining) any required regulatory reviews or approvals or any other consents or approvals associated with our announced plan to reduce the size of our financial services businesses;
  • our ability to complete incremental asset sales as part of that plan in a timely manner (or at all) and at the prices we have assumed;
  • changes in law, economic and financial conditions, including interest and exchange rate volatility, commodity and equity prices and the value of financial assets, including the impact of these conditions on our ability to sell or the value of incremental assets to be sold as part of our announced plan to reduce the size of our financial services businesses as well as other aspects of that plan;
  • the impact of conditions in the financial and credit markets on the availability and cost of GECC’s funding, and GECC’s exposure to counterparties;
  • the impact of conditions in the housing market and unemployment rates on the level of commercial and consumer credit defaults;
  • pending and future mortgage loan repurchase claims and other litigation claims in connection with WMC, which may affect our estimates of liability, including possible loss estimates;
  • our ability to maintain our current credit rating and the impact on our funding costs and competitive position if we do not do so;
  • the adequacy of our cash flows and earnings and other conditions, which may affect our ability to pay our quarterly dividend at the planned level or to repurchase shares at planned levels;
  • GECC’s ability to pay dividends to GE at the planned level, which may be affected by GECC’s cash flows and earnings, financial services regulation and oversight, and other factors;
  • our ability to convert pre-order commitments/wins into orders;
  • the price we realize on orders since commitments/wins are stated at list prices;
  • customer actions or developments such as early aircraft retirements or reduced energy demand and other factors that may affect the level of demand and financial performance of the major industries and customers we serve;
  • the effectiveness of our risk management framework;
  • the impact of regulation and regulatory, investigative and legal proceedings and legal compliance risks, including the impact of financial services regulation and litigation;
  • adverse market conditions, timing of and ability to obtain required bank regulatory approvals, or other factors relating to us or Synchrony Financial that could prevent us from completing the Synchrony Financial split-off as planned;
  • our capital allocation plans, as such plans may change including with respect to the timing and size of share repurchases, acquisitions, joint ventures, dispositions and other strategic actions;
  • our success in completing, including obtaining regulatory approvals for, announced transactions, such as the proposed transactions and alliances with Alstom, Appliances and our announced plan to reduce the size of our financial services businesses, and our ability to realize anticipated earnings and savings;
  • our success in integrating acquired businesses and operating joint ventures;
  • the impact of potential information technology or data security breaches; and
  • the other factors that are described in “Risk Factors” in our Annual Report on Form 10-K for the year ended December 31, 2014.

These or other uncertainties may cause our actual future results to be materially different than those expressed in our forward-looking statements.  We do not undertake to update our forward-looking statements.
