Scientists have for the first time produced human embryonic stem cells from adult cells through a cloning process.
The success by a team at Oregon Health and Science University puts human “therapeutic cloning” back on the scientific agenda as a potential source of stem cells for regenerative medicine, after a few years in which attention focused on other methods that seemed easier to achieve.
The research may also revive fears about the birth of human clones, though the Oregon scientists insist that their work could not be used for this purpose.
“Our finding offers new ways of generating stem cells for patients with dysfunctional or damaged tissues and organs,” says Shoukhrat Mitalipov, senior author of the study published in the journal Cell. “Such stem cells can regenerate and replace those damaged cells and tissues and alleviate diseases that affect millions of people.”
The Oregon scientists followed the “nuclear transfer” approach first proposed in the late 1990s, after the cloning of Dolly the sheep, as a source of stem cells that would be genetically identical to the patient. The nucleus of an adult skin cell is transferred to a human egg whose own nucleus has been removed and, following biochemical treatment and an electric shock, it starts growing into an embryo.
Intensive research in several laboratories produced partial success at best, but no one could generate human embryos good enough to be a source of stem cells. So attention switched to an alternative technique for making patient-specific cells – known as “induced pluripotent stem cells” or iPSCs.
iPSCs are generated directly from the patient’s skin cells by adding a biochemical and genetic cocktail that turns their developmental clock back to an embryonic state. Although this sounds simpler than therapeutic cloning, concerns that the process might cause undesirable mutations in iPSCs mean that scientists are keen to find other ways to generate embryonic stem cells.
The Oregon team persisted with therapeutic cloning research and eventually succeeded. Coffee is an essential feature of lab life – and the vital ingredient in their procedure turned out to be caffeine.
“It is remarkable that adding caffeine was the key that resulted in embryonic stem cell lines from all three [egg] donors,” commented Alison Murdoch, professor of reproductive medicine at Newcastle University in the UK, where scientists have carried out similar research.
Although the success of the Oregon experiment may again raise fears of therapeutic cloning technology being used for reproductive cloning, the scientists say this would not work in practice. Several years of monkey studies suggest that the embryos are good enough to generate stem cells but not to implant into a womb and grow into a healthy baby clone.
“Our research is directed towards generating stem cells for use in future treatments to combat disease,” said Professor Mitalipov. “While nuclear transfer breakthroughs often lead to a public discussion about the ethics of human cloning, this is not our focus, nor do we believe our findings might be used by others to advance the possibility of human reproductive cloning.”
If you’ve been following Google I/O, Google’s annual developer conference, and you are an Android developer (or plan to be one), then you couldn’t have missed this awesome announcement.
Android Studio is a new IDE for Android development that Google is developing in cooperation with JetBrains, based on the IntelliJ Platform and the existing functionality of IntelliJ IDEA Community Edition.
This was a popular announcement, as the crowd “ooh’d” and “ahh’d” as screenshots were shown on stage.
The tool adds options for Android development that make the process faster and more productive. A “live layout” was demonstrated that renders your app in real time as you edit.
Additionally, you can switch over to different layouts and screen sizes, such as a 3.7-inch phone and a 10-inch tablet. The team noted that this might be useful for internationalization, allowing you to quickly see what things look like without having to package up your app and install it on a device.
The company says that it has “big plans” for Android Studio.
Higher developer productivity, code beauty and excellent out-of-the-box experience are the concepts behind IntelliJ IDEA that make it really stand out. This is a significant addition to the tools and APIs offered by Google.
If you’re curious about what will happen with IntelliJ IDEA, no worries: IntelliJ IDEA Community Edition will remain a free and open Java IDE with full Android support, and it will include the new features developed by both the Google and JetBrains teams. The features will, of course, also be available in IntelliJ IDEA Ultimate.
By the way, JetBrains has just opened the Early Access Preview of IntelliJ IDEA v13, so you can start trying it and providing your feedback. This EAP build includes all of the new features of Android Studio except for the new project wizard and the AppEngine cloud endpoints integration. These latter features will also appear in our EAP builds in the coming weeks.
Last summer, in a Harvard robotics laboratory, an insect took flight. Half the size of a paper clip, weighing less than a tenth of a gram, it leapt a few inches, hovered for a moment on fragile, flapping wings, and then sped along a preset route through the air.
Like a proud parent watching a child take its first steps, graduate student Pakpong Chirarattananon immediately captured a video of the fledgling and emailed it to his adviser and colleagues at 3 a.m. — subject line: “Flight of the RoboBee.”
“I was so excited, I couldn’t sleep,” recalls Chirarattananon, co-lead author of a paper published this week in Science.
The demonstration of the first controlled flight of an insect-sized robot is the culmination of more than a decade’s work, led by researchers at the Harvard School of Engineering and Applied Sciences (SEAS) and the Wyss Institute for Biologically Inspired Engineering at Harvard.
“This is what I have been trying to do for literally the last 12 years,” says Robert J. Wood, Charles River Professor of Engineering and Applied Sciences at SEAS, Wyss core faculty member, and principal investigator of the National Science Foundation-supported RoboBee project. “It’s really only because of this lab’s recent breakthroughs in manufacturing, materials, and design that we have even been able to try this. And it just worked, spectacularly well.”
Inspired by the biology of a fly, with submillimeter-scale anatomy and two wafer-thin wings that flap almost invisibly, 120 times per second, the tiny device not only represents the absolute cutting edge of micromanufacturing and control systems, but is an aspiration that has impelled innovation in these fields by dozens of researchers across Harvard for years.
“We had to develop solutions from scratch, for everything,” explains Wood. “We would get one component working, but when we moved onto the next, five new problems would arise. It was a moving target.”
Flight muscles, for instance, don’t come prepackaged for robots the size of a fingertip.
“Large robots can run on electromagnetic motors, but at this small scale you have to come up with an alternative, and there wasn’t one,” says co-lead author Kevin Y. Ma, a graduate student at SEAS.
The tiny robot flaps its wings with piezoelectric actuators — strips of ceramic that expand and contract when an electric field is applied. Thin hinges of plastic embedded within the carbon fiber body frame serve as joints, and a delicately balanced control system commands the rotational motions in the flapping-wing robot, with each wing controlled independently in real time.
At tiny scales, small changes in airflow can have an outsized effect on flight dynamics, and the control system has to react that much faster to remain stable.
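The article doesn’t publish the controller itself, but the basic idea it describes — flapping each wing independently at roughly 120 Hz and biasing one wing’s stroke amplitude to counter a disturbance — can be sketched in a few lines of Python. All names, numbers, and gains below are illustrative, not the team’s actual design:

```python
import math

FLAP_HZ = 120.0                      # wing-beat frequency cited in the article

def wing_drive(t, amplitude, phase=0.0):
    """Sinusoidal drive signal for one piezoelectric actuator (normalized units)."""
    return amplitude * math.sin(2 * math.pi * FLAP_HZ * t + phase)

def roll_correction(roll_error, gain=0.1):
    """Proportional bias: flap one wing harder than the other to fight a roll error."""
    return gain * roll_error

# With a small measured roll error, bias the two wings' stroke amplitudes
# in opposite directions around the nominal amplitude of 1.0.
roll_error = 0.2
delta = roll_correction(roll_error)
t = 0.001                            # one millisecond into the stroke
left = wing_drive(t, 1.0 + delta)
right = wing_drive(t, 1.0 - delta)
```

A real controller would close this loop hundreds of times per second from attitude sensing; the point of the sketch is only that each wing gets its own independently adjusted drive signal.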
The robotic insects also take advantage of an ingenious pop-up manufacturing technique that was developed by Wood’s team in 2011. Sheets of various laser-cut materials are layered and sandwiched together into a thin, flat plate that folds up like a child’s pop-up book into the complete electromechanical structure.
The quick, step-by-step process replaces what used to be a painstaking manual art and allows Wood’s team to use more robust materials in new combinations, while improving the overall precision of each device.
“We can now very rapidly build reliable prototypes, which allows us to be more aggressive in how we test them,” says Ma, adding that the team has gone through 20 prototypes in just the past six months.
Applications of the RoboBee project could include distributed environmental monitoring, search-and-rescue operations, or assistance with crop pollination, but the materials, fabrication techniques, and components that emerge along the way might prove to be even more significant. For example, the pop-up manufacturing process could enable a new class of complex medical devices. Harvard’s Office of Technology Development, in collaboration with Harvard SEAS and the Wyss Institute, is already in the process of commercializing some of the underlying technologies.
“Harnessing biology to solve real-world problems is what the Wyss Institute is all about,” says Wyss Founding Director Don Ingber. “This work is a beautiful example of how bringing together scientists and engineers from multiple disciplines to carry out research inspired by nature and focused on translation can lead to major technical breakthroughs.”
And the project continues.
“Now that we’ve got this unique platform, there are dozens of tests that we’re starting to do, including more aggressive control maneuvers and landing,” says Wood.
After that, the next steps will involve integrating the parallel work of many different research teams that are working on the brain, the colony coordination behavior, the power source, and so on, until the robotic insects are fully autonomous and wireless.
The prototypes are still tethered by a very thin power cable because there are no off-the-shelf solutions for energy storage that are small enough to be mounted on the robot’s body. High-energy-density fuel cells must be developed before the RoboBees will be able to fly with much independence.
Control, too, is still wired in from a separate computer, though a team led by SEAS faculty Gu-Yeon Wei and David Brooks is working on a computationally efficient brain that can be mounted on the robot’s frame.
“Flies perform some of the most amazing aerobatics in nature using only tiny brains,” notes co-author Sawyer B. Fuller, a postdoctoral researcher on Wood’s team who essentially studies how fruit flies cope with windy days. “Their capabilities exceed what we can do with our robot, so we would like to understand their biology better and apply it to our own work.”
The milestone of this first controlled flight represents a validation of the power of ambitious dreams — especially for Wood, who was in graduate school when he set this goal.
“This project provides a common motivation for scientists and engineers across the University to build smaller batteries, to design more efficient control systems, and to create stronger, more lightweight materials,” says Wood. “You might not expect all of these people to work together: vision experts, biologists, materials scientists, electrical engineers. What do they have in common? Well, they all enjoy solving really hard problems.”
“I want to create something the world has never seen before,” adds Ma. “It’s about the excitement of pushing the limits of what we think we can do, the limits of human ingenuity.”
An experimental digital camera that mimics the compound eyes possessed by insects such as dragonflies, houseflies and praying mantises has been designed by scientists. The innovative camera, which can capture wide-angle photos without distorting the image, was built by a team of researchers along with a University of Colorado Boulder engineer.
Details of the camera have been described in the journal Nature. Using stretchable electronics and a pliable sheet of micro-lenses made from a material similar to that used in contact lenses, the camera offers a practically infinite depth of field and can capture a 160-degree-wide field of view.
In conventional wide-angle lenses, captured images are distorted at the periphery because light focused by a curved lens falls on a flat detector. In the new study, the researchers created an electronic detector that can be curved into the same hemispherical shape as the lens, so photos are captured without distortion.
“The most important and most revolutionizing part of this camera is to bend electronics onto a curved surface,” Jianliang Xiao, assistant professor of mechanical engineering at CU-Boulder and co-lead author of the study, said in a news statement. “Here, by using stretchable electronics we can deform the system; we can put it onto a curved surface.”
Compound eyes are made up of many smaller eyes known as ommatidia; each ommatidium consists of its own corneal lens and a crystalline cone that captures light travelling through the lens. Mimicking this lens-and-cone pairing, the camera was built with 180 miniature lenses, each backed by a small electronic detector. The number of lenses matches the number of ommatidia in the compound eyes of fire ants and bark beetles.
In a recent article posted to the Discoveries section of the National Science Foundation’s website, writer Valerie Thompson explores how scientists are trying to tap the “unmatched ability of the human brain to process and make sense of large amounts of complex data” and apply it to the ‘smart’ power grid systems now under development.
The research in this area is reportedly being led by Ganesh Kumar Venayagamoorthy, Ph.D., director of the Real-Time Power and Intelligent Systems Laboratory at Clemson University. His team of neuroscientists and engineers is using actual brain neurons, grown in a dish, to control simulated power grids. The team aims to find new methods for managing the country’s power supply by observing how neural networks integrate and respond to complex information.
“The brain is one of the most robust computational platforms that exists,” says Venayagamoorthy. “As power-systems control becomes more and more complex, it makes sense to look to the brain as a model for how to deal with all of the complexity and the uncertainty that exists.”
A little over a century ago, isolated power plants supplied power only to local customers. Today, an estimated 200,000 miles of power lines interconnect more than 6,000 power plants, forming a grid that keeps electricity flowing across the country even when some plants fail. Managing all of that power efficiently is a complex and delicate task.
The current grid system is severely outdated and struggles to keep up with the higher demand for electric power in today’s high-tech environment. Most power plants were built some 50 years ago, when demand and consumption patterns were significantly different, and the technology and components at many stations are obsolete or have exceeded their life expectancy.
Additionally, effectively incorporating renewable ‘green energy’ sources such as solar and wind into the grid requires the ability to store energy for later release into the system, a capability the current grid lacks.
“In order to get the most out of the different types of renewable energy sources, we need an intelligent grid that can perform real-time dispatch and manage optimally available energy storage systems,” says Venayagamoorthy. The Department of Energy currently estimates that a mere 5 percent increase in the efficiency of the power grid would save the energy equivalent to taking 53 million cars off the road.
Venayagamoorthy believes that by modeling the grid based on neural interactions in the human brain we can achieve vast improvements in the efficiency of the energy grid. “What we need is a system that can monitor, forecast, plan, learn, make decisions,” says Venayagamoorthy. “Ultimately, what we need is a control system that is very brain-like.”
To understand just how the brain integrates and responds to information, Venayagamoorthy turned to neuroscientist Steve Potter, Ph.D., director of the Laboratory for NeuroEngineering at the Georgia Institute of Technology. Potter’s research involves growing neurons in a dish containing a grid of electrodes that can both stimulate and record activity. The electrodes connect the neuronal network to a computer, allowing two-way communication between the living and the electronic components.
“The goal is to translate the physical and functional changes that occur as a living neuronal network learns into mathematical equations, ultimately leading to a more brain-like intelligent control system,” says Venayagamoorthy.
So far the researchers have successfully “taught” a living neuronal network how to respond to complex data, and have incorporated these findings into simulated versions called bio-inspired artificial neural networks (BIANNS). They are currently using the new and improved BIANNS to control synchronous generators connected to a power system, with the hope that this work will pave the way for smarter control of our future power grid.
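The article doesn’t describe the internals of a BIANN, but the general idea of a network that learns, from examples, to map a grid measurement to a control action can be illustrated with the simplest possible case: a single artificial neuron trained with the delta rule. The data, gains, and scenario below are invented purely for illustration:

```python
# A single artificial neuron trained with the delta rule to map a generator
# frequency deviation (Hz) to a corrective power adjustment.  Invented for
# illustration; the article does not describe the team's actual BIANN.

def train(samples, lr=0.5, epochs=500):
    w, b = 0.0, 0.0                    # weight and bias start untrained
    for _ in range(epochs):
        for x, target in samples:
            y = w * x + b              # neuron output: suggested correction
            err = target - y
            w += lr * err * x          # delta-rule updates
            b += lr * err
    return w, b

# Toy training data: a 0.1 Hz frequency drop should be met with +0.5 units
# of extra power, linearly.
samples = [(-0.2, 1.0), (-0.1, 0.5), (0.0, 0.0), (0.1, -0.5), (0.2, -1.0)]
w, b = train(samples)
correction = w * (-0.15) + b           # response to an unseen 0.15 Hz drop
```

A real grid controller would use many neurons, many inputs, and continual adaptation, but the mechanism — weights adjusted by observed error until the network’s responses are useful — is the same in spirit.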
IBM on Monday launched an appliance designed to manage data from mobile devices and sensors tangled up in the Internet of things.
The IBM MessageSight appliance is aimed at the auto, traffic management, healthcare, oil and gas and home appliance industries. Big Blue is basing its appliance on the Message Queuing Telemetry Transport (MQTT) technology. Last week, a bevy of technology companies said they would collaborate to create a standardized version of MQTT, which could enable more machine-to-machine applications.
MQTT, already used in sensors and medical devices, is a lightweight messaging protocol that connects sensors and physical objects to servers and networks. If successful, MQTT will let data be exchanged among sensors, tablets, phones and data centers. The standard targets low bandwidth, low power consumption, a small code footprint and efficient distribution of data.
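The article doesn’t show code, but part of what keeps MQTT lightweight is its simple publish/subscribe routing: messages are addressed by hierarchical topic strings, and subscribers can use the `+` (single-level) and `#` (multi-level) wildcards defined in the MQTT specification. A simplified Python sketch of that topic-matching rule (topic names here are made up):

```python
def topic_matches(pattern: str, topic: str) -> bool:
    """Simplified MQTT topic-filter matching.

    '+' matches exactly one topic level; '#' matches the remainder of the
    topic and must appear last.  Real MQTT has extra rules (e.g. for topics
    beginning with '$') that this sketch ignores.
    """
    p_levels = pattern.split("/")
    t_levels = topic.split("/")
    for i, p in enumerate(p_levels):
        if p == "#":               # multi-level wildcard: match everything left
            return True
        if i >= len(t_levels):     # topic ran out of levels before the filter did
            return False
        if p not in ("+", t_levels[i]):
            return False
    return len(p_levels) == len(t_levels)

# A sensor publishing to "plant/boiler3/temperature" reaches these subscribers:
print(topic_matches("plant/+/temperature", "plant/boiler3/temperature"))  # True
print(topic_matches("plant/#", "plant/boiler3/temperature"))              # True
print(topic_matches("plant/+", "plant/boiler3/temperature"))              # False
```

Because a broker only has to compare short level-by-level strings, routing stays cheap enough for constrained devices and low-bandwidth links.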
According to IBM, MessageSight can support one million concurrent sensors or smart devices. The appliance can also scale to 13 million messages per second.
MessageSight falls under IBM’s Smarter Planet initiative, which in part aims to monitor and analyze physical structures. IBM said it plans to connect MessageSight with its MobileFirst software. Typically sensors haven’t been able to communicate wirelessly, but MQTT enables low-power applications and real-time updates.