07 January 2017

Lab-grown stomach gets scientists one step closer to a ‘human on a chip’

Lab-grown stomach

More people are affected by stomach diseases than by heart disease. In most cases the problems are relatively minor, such as overproduction of acid or gastritis, but in a growing number of instances they are linked with gastric cancer, which affects around 26,370 people a year in the United States alone.

To find out more about stomachs and the effects of bacteria such as Helicobacter pylori, researchers at Cincinnati Children’s Hospital Medical Center created a “Petri dish stomach,” complete with the ability to produce acid and digestive enzymes.

“What my lab has been doing for over a decade is trying to generate human organ tissues in a Petri dish,” Dr. James Wells, lead investigator, told Digital Trends. “Organ tissues represent a really good way of investigating human disease on a level that you can’t do by studying patients.”

The work, published in the journal Nature, describes how a functioning “organoid” model of a mini stomach can be grown from pluripotent stem cells, which can develop into any tissue in a person. By “growing” a stomach, researchers get to watch exactly how diseases affect that particular part of the body, from what happens when too much acid builds up to how certain experimental drugs are able to help deal with inflammation.

human on a chip

A lab-grown piece of human stomach, as seen under a microscope.

“We turn the stem cells into something which is effectively a functioning mini-stomach,” Wells continued. “It’s only a few millimeters in size, but it can produce acid, digestive enzymes, and respond to the cues that trigger your stomach to respond in different ways. In other words, while they are small, [Petri dish stomachs] have the same physiological properties as an actual stomach.”

The eventual goal, he said, is to develop a “human on a chip,” which would take the form of a credit card-sized device containing similar organoids for every organ in the human body. Eventually, these could be used to help treat patients.

“Organs that have to be removed because of damage or disease are very hard to replace, outside of organ donors, who there are a real shortage of,” he said. “In the future, we think it should be possible to scale up these mini organs into something that is a therapeutic transplant. That is the direction we’re headed in.”

Source: Digital Trends

25 December 2016

Artificial leaf could make a medicinal mini-factory

artificial leaf medicine

Inspired by a leaf, researchers at TU Eindhoven have developed a "mini-factory" that can use sunlight to manufacture chemical products such as drugs.

Leaves are kind of like nature's power plants, converting incoming sunlight into energy for the plant to thrive on. Inspired by the real thing, scientists have previously created artificial leaves that function in much the same way as their natural counterparts to produce electricity and even liquid fuels. Now a team at Eindhoven University of Technology (TU/e) is using a similar system to produce chemicals, which could one day lead to solar-powered "mini-factories" that can produce drugs, pesticides and other chemicals almost anywhere.

To mimic the light-capturing molecules in leaves, the researchers turned to luminescent solar concentrators (LSCs), materials seen in solar-harvesting window technology and used to catch and amplify laser beams carrying data in Facebook's drone-mounted internet project. These LSCs absorb incoming light, convert it to specific wavelengths and then guide the photons to the edges of the device.

The TU/e team's take on the idea was to create a leaf-shaped device, made from a silicon rubber LSC, with a thin channel running through it like the veins in a leaf. As chemicals are pumped through the channel, the LSC material directs sunlight towards it, and the high intensity of the sunlight can trigger a chemical reaction with the liquid in the channel. Essentially, one substance enters, and by the time it comes out the other end, the device will have converted it into a different chemical, which may be useful as a drug, fuel or other agent.

artificial leaf medicine


"Using a reactor like this means you can make drugs anywhere, in principle, whether malaria drugs in the jungle or paracetamol on Mars," says Timothy Noël, lead researcher on the study. "All you need is sunlight and this mini-factory."

These devices could prove a useful alternative to other means of drug production, which can require toxic chemicals and plenty of energy usually produced by fossil fuels. In early tests, the mini-factories exceeded the team's expectations for efficiency.

"Even an experiment on a cloudy day demonstrated that the chemical production was 40 percent higher than in a similar experiment without LSC material," says Noël. "We still see plenty of possibilities for improvement. We now have a powerful tool at our disposal that enables the sustainable, sunlight-based production of valuable chemical products like drugs or crop protection agents."

The research was published in the journal Angewandte Chemie. The team explains the device in the video below.



Source: Eindhoven University of Technology, newatlas

24 December 2016

This app gives every last corner of Earth an address

What3Words

Imagine being lost on the side of a mountain, or on a remote ski slope, and being able to tell rescuers your actual location in just three words. Or a system that could get mail to people in the favelas, or nomads on the Mongolian steppes, or places like the Faroe Islands (long on sheep, short on maps and roads). British startup What3Words has an app for that. More precisely, they have an algorithm with a GUI draped on top of it for user convenience, and an API so that others can integrate with the system.

Here’s how it works: they started by dividing the entire globe into 57 trillion 3 x 3 m squares (roughly 10 feet on a side). Each square is then assigned a fixed, permanent, and unique 3-word address, drawn from a sanitized pool of 25,000 to 40,000 dictionary words, depending on which of the nine currently supported languages the user prefers. The algorithm converts each square’s range of lat-long values into a single 3-word string that looks rather like the name of a niche IRC channel.

For example, the choice spot under the rain shelter at the metro station outside the science building at Vrije Universiteit in Amsterdam has an address: it’s (somewhat fittingly) searched.final.ambient. The decommissioned fire tower at the top of Hurricane Mountain, in the Adirondacks of New York, is corrosive.sculpture.assumed. With What3Words’ algorithm, it’s possible to address a place that specific. This means that you really can get mail to the cupboard under the stairs. It’s even possible to precisely address a place where the roads aren’t named and the houses aren’t numbered, or a place in the wilderness where there aren’t roads or houses at all.
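
What3Words’ actual word lists and conversion algorithm are proprietary, but the general shape of such a scheme (a reversible mapping between a grid-cell index and three words) can be sketched in a few lines of Python. Everything below, from the word list to the grid math, is an illustrative assumption rather than the company’s real system.

# A minimal, illustrative three-word geocoding sketch; NOT the What3Words
# algorithm, whose word lists and mapping are proprietary.
WORDS = ["able", "acid", "ambient", "assumed", "badge", "basin", "cedar",
         "corrosive", "delta", "ember", "final", "garnet", "harbor", "ivory",
         "lunar", "meadow", "nickel", "orchid", "pixel", "quartz", "ripple",
         "searched", "sculpture", "velvet", "willow"]

CELL_DEG = 3 / 111_320        # roughly 3 m expressed in degrees of latitude
COLS = int(360 / CELL_DEG)    # grid columns around the globe (ignores meridian convergence)

def latlon_to_words(lat: float, lon: float) -> str:
    """Snap a coordinate to its grid cell, then encode the cell index in base len(WORDS).

    With 25 toy words there are only 25**3 unique addresses, so this version collides;
    a real list of ~40,000 words gives about 6.4e13 combinations, enough to cover
    57 trillion 3 x 3 m squares uniquely."""
    row = int((lat + 90) / CELL_DEG)
    col = int((lon + 180) / CELL_DEG)
    index = row * COLS + col
    words = []
    for _ in range(3):
        index, remainder = divmod(index, len(WORDS))
        words.append(WORDS[remainder])
    return ".".join(words)

print(latlon_to_words(52.3336, 4.8658))   # a deterministic (toy) three-word address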

what3words algorithm

The whole idea behind using words instead of lat-long is that it profoundly simplifies giving extremely specific locations. Lat-long coordinates are a mouthful, and they’re really unpleasant if you have to give them over a crappy audio channel. But combining extreme specificity with user-facing simplicity is the niche What3Words is seeking to fill.

Latitude and longitude are still better when computers are talking to one another, but when you involve humans, it gets messy. “If you have an injury on the slopes,” chief marketing officer Giles Rhys Jones told Magenta, “it’s incredibly difficult to describe where you are. The problem with using GPS coordinates is that if I’m trying to shout 18 digits on a telephone while I’m incredibly stressed, errors creep in.” The free What3Words app works without a data connection, and it’s tiny (less than 10 MB), so it’s lightweight enough for basically any mobile phone.

what3words callout

The system works well enough for the Mongolian government to wager their postal system on its quality. The World Bank estimates that a quarter of the Mongolian population lives a nomadic lifestyle. Many of them live on the steppes, where there are no roads, no permanent structures, and certainly no numbers to tell a bewildered postal worker where to go. But those people are still citizens who deserve representation, and especially the right to vote, which they can do by mail. This algorithm lets everyone have an address, because every place on the planet can be concisely described. (Boreholes, skyscrapers and the Hive from Resident Evil excluded, for the curious; adding another word, though, for a string of four, could allow for this kind of vertical-axis specificity… devs, are you listening?)

Santa clearly has a bespoke GPS app that uses an overlay based on the What3Words API, to enable present routing. For us mere mortals, there’s already a mapping app that speaks What3Words; it’s called navmii.

Source: ExtremeTech

05 September 2016

Sheep puns reach critical mass as Google finally gets involved in Sheep View

Google Sheepview

Somehow managing to badger Google both incessantly and without snark, Durita Dahl Andreassen has finally seen it happen: Google got formally involved in her homebrew project to put the heretofore unnoticed Faroe Islands on Street View.

There are 50,000 people in the Faroe Islands, but more than 70,000 sheep. The place is pretty rural, and up until now it hasn’t exactly been a hot property on the Google Street View acquisition list. There’s been a Street View camera inside the White House, at CERN, down Diagon Alley, and even inside the TARDIS, but never yet in the Faroe Islands. But with her campaign to get Street View making the news all over the web, suddenly that has changed. Where there’s a wool, there’s a way.

Andreassen started her project by teaming up with other Islanders to build bespoke camera harnesses that she then strapped to her sheep. Loosing them at particularly important or picturesque places around the Faroe Islands archipelago, she then collected the images and turned to the Internet for help. The project even received official endorsement from the tourism bureau of the Faroe Islands, making its way onto their website. They’ve been calling it Sheep View 360, after the 360 cameras the sheep are carrying.


Sheep aren’t really supposed to be on the roads, though, which presents an obvious difficulty when trying to get the roads mapped using Street View.

Google obviously heard about Sheep View 360, and it didn’t take long for them to figure out how to respond. They sent a Street View trekker and 360 cameras via their Street View camera loan program, and even dispatched a Google Maps team to the Faroe Islands to help train the locals, ensuring that the humans and sheep will both be capturing the absolute best images they can get.

Better still, it won’t just be sheep anymore. Faroe Islanders and tourists both can help collect Street View imagery of the remote, beautiful islands using “selfie-sticks, bikes, backpacks, cars, kayaks, horses, ships and even wheelbarrows.” This, too, has received official approval: the Visit Faroe Islands office in Tórshavn (hilariously, not yet findable on Street View), along with Atlantic Airways at the airport, will be lending out Street View 360 cameras to those willing to help out with the mapping adventure.

Only a true connoisseur of sheep-based humor will be able to make it out alive from Google’s blog post about Sheep View. Be thankful I wasn’t feeling clever when writing this, or the sheep puns would surely have killed you (ewe? Ed) by now.

Source: ExtremeTech

10 August 2016

BAE Systems wants to grow military aircraft in chemical vats

grow military aircraft in chemical


BAE Systems and the University of Glasgow foresee a time when new aircraft can be designed and chemically grown in a matter of weeks (Credit: BAE Systems)

Modern military aircraft are so complex that fighters like the F-35 Lightning II or the Typhoon take 20 years to go from drawing board to deployment, at phenomenal cost. With design work already starting on next-generation fighters for the 2040s, BAE Systems and the University of Glasgow are looking at a faster, cheaper way to produce unmanned air vehicles (UAVs), in which the aircraft aren't constructed but grown in computer-controlled chemical vats in a matter of weeks.

This vision of the future of aircraft design and manufacturing was outlined ahead of the upcoming Farnborough International Airshow, which runs from July 11 to 17. The purpose of this concept isn't just to cut cost and the painfully long development cycle of military aviation hardware. It's also a reflection of the growing emphasis on swarms of smaller drone aircraft that can be built to custom specifications for specific missions over manned aircraft.

Such use of bespoke UAVs would require radically shorter development and manufacturing cycles, which inspired BAE's vision of growing them in huge chemical vats to create near-complete airframes and systems.

The key to this is the "Chemputer," a combination of computer and chemical manufacturing. Originally developed by Regius Professor Lee Cronin at the University of Glasgow, who is also Founding Scientific Director at Cronin Group PLC, it's a sort of advanced 3D printer that works on a molecular level. Its original purpose was to use simple, locally available chemicals to produce pharmaceuticals quickly and cheaply. Now, the technology is being envisaged as a way to produce full-blown aircraft and their electrical systems.

For the BAE concept, the Chemputer would be part of a system for building UAVs, or multi-functional parts for large manned aircraft, on a molecular level out of environmentally sustainable materials using advanced chemical processes. The result would be to allow mission-specific drones to be built in a very short timeframe. Developers could choose from a menu of capabilities, and the Chemputer would bring together the necessary technologies and grow them.

In this way, fleets of small drones could be made quickly to carry out a variety of missions. They could drop supplies to special forces, carry out surveillance, or operate at speeds and altitudes that would make them invulnerable to anti-aircraft missiles.

"This is a very exciting time in the development of chemistry," says Cronin. "We have been developing routes to digitize synthetic and materials chemistry and at some point in the future hope to assemble complex objects in a machine from the bottom up, or with minimal human assistance. Creating small aircraft would be very challenging but I'm confident that creative thinking and convergent digital technologies will eventually lead to the digital programming of complex chemical and material systems."

The animation below shows how the warplanes of the future might be created.


Source: BAE Systems, gizmag

10 July 2016

Computer coughs up passwords, encryption keys through its cooling fans

hackers can hear what you speak using cpu cooling fan

Here’s a security update to haunt your dreams, and to make the FBI’s quest for un-exploitable cryptographic backdoors look all the more absurd: a team of Israeli researchers has now shown that the sounds made by a computer’s fan can be used to leak everything from usernames and passwords to full encryption keys. It’s not really a huge programming feat, as we’ll discuss below, but from a conceptual standpoint it shows how wily modern cyber attackers can be, and why the weakest link in any security system still involves the human element.

In hacking, there’s a term called “phreaking” that used to refer to phone hacking via automated touch-tone systems, but which today colloquially refers to any kind of system investigation or manipulation that uses sound as its main mechanism of action. Phone phreakers used to make free long-distance calls by playing the correct series of tones into a phone receiver, but phreaks can listen to sounds just as easily as they can produce them, often with even greater effect.


curiosity hackers

That’s because sound has the potential to get around one of the most powerful and widely used methods in high-level computer security: air-gapping, or the separation of a system from any externally connected network an attack might be able to use for entry. (The term pre-dates wireless internet, and a Wi-Fi-connected computer is not air-gapped, despite the literal gap of air around it.)

So how do you hack your way into an air-gapped computer? Use something that moves easily through the air, and which all computers are creating to one extent or another: Sound.


One favorite worry of paranoiacs is something called Van Eck phreaking, in which you monitor the stray emissions of a device to derive something about what it is doing; in extreme cases, it’s alleged that an attacker can recreate the image on the screen of a CRT monitor from its electromagnetic emanations alone. Another, more recent phreaking victory showed that it is possible to break RSA encryption with a full copy of the encrypted message and an audio recording of the processor as it goes through the normal, authorized decryption process.

chinese military at computers possibly hacking


Note that in order to do any of this, you have to get physically close enough to your target to put a microphone within listening range. If your target system is inside CIA Headquarters, or Google X, you’re almost certainly going to need an agent on the inside to make that happen, and if you’ve got one of those available, you can probably use them to do a lot more than place microphones. On the other hand, once placed, this microphone’s security hole won’t be detectable in the system logs, since it’s not actually interacting with the system in any way, just hoovering up incidental leakage of information.

This new fan attack actually requires even more specialized access, since you have to not only get a mic close to the machine but also infect the machine with fan-exploiting malware. The idea is that most security software actively looks for anything that might be unusual or harmful behavior, from sending out packets of data over the internet to making centrifuges spin up and down more quickly. Security researchers might have enough foresight to look at fan activity from a safety perspective, and make sure no malware turns the fans off and melts the computer, but will they be searching for data leaks in such an out-of-the-way part of the machine? After this paper, the answer is: “You’d better hope so.”


Stuxnet virus life cycle

A diagram of the life-cycle of the Stuxnet virus.

The team used two fan speeds to represent the 1s and 0s of their code (1,000 and 1,600 RPM, respectively) and listened to the sequence of fan whines to keep track. Their maximum “bandwidth” is about 1,200 bits an hour, or about 0.15 kilobytes. That might not sound like a lot, but 0.15KB of sensitive, identifying information can be crippling, especially if it’s something like a password that grants further access. You can fit a little over 150 alphanumeric characters into that space; that’s a whole lot of passwords to lose in a single hour.
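
As a rough illustration of that encoding (one fan speed per bit value, held long enough for a microphone to pick out the change in pitch), here is a hedged sketch. The fan speeds are the ones quoted above; the timing, framing, and the set_fan_rpm() stub are assumptions for demonstration, not the researchers’ published implementation.

import time

# Fan-speed covert channel sketch: bits are signalled by switching between two
# fan speeds and holding each one long enough to be decoded acoustically.
BIT_RPM = {"1": 1000, "0": 1600}   # speeds quoted in the article
BIT_PERIOD_S = 3.0                 # ~1,200 bits per hour works out to about 3 s per bit

def set_fan_rpm(rpm: int) -> None:
    """Stand-in for platform-specific fan control (e.g. via EC/ACPI registers)."""
    print(f"[fan] holding {rpm} RPM")

def transmit(payload: bytes) -> None:
    bits = "".join(f"{byte:08b}" for byte in payload)
    for bit in bits:
        set_fan_rpm(BIT_RPM[bit])
        time.sleep(BIT_PERIOD_S)   # hold the speed so the whine is distinguishable

transmit(b"pw:hunter2")            # 80 bits, i.e. roughly four minutes at this rate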

There is simply no way to make any system immune to infiltration. You can limit the points of vulnerability, then supplement those points with other measures. That’s what air-gapping is: condensing the vulnerabilities down to physical access to the machine, then shoring that up with big locked metal doors, security cameras, and armed guards.

But if Iran can’t keep its nuclear program safe, the US can’t keep its energy infrastructure safe, and Angela Merkel can’t keep her cell phone safe, how likely are the world’s law enforcement agencies to be able to ask a bunch of software companies to keep millions of diverse and security-ignorant customers safe, with one figurative hand tied behind their backs?


FBI


On the other hand, this story also illustrates the laziness of the claim that the FBI can’t develop ways of hacking these phones on its own, a reality that is equally distressing in its own way. The FBI has bragged that it’s getting better at such attacks “every day,” meaning that the only things protecting you from successful attacks against your phone are the research resources available to the FBI, and the access to your phone that the FBI can rely on having, for instance by seizing it.

Nobody should be campaigning to make digital security weaker, to any extent, for any reason; as this story shows, our most sensitive information is already more than vulnerable enough as it is.

Source: ExtremeTech

"Wearable" for plants to let you converse with a chrysanthemum

plant speaks

The Phytl Signs device picks up the tiny electrical signals emitted by plants.

Houseplants have never been known as great conversationalists, but it's possible we just can't hear what they're saying. Swiss company Vivent SARL is hoping to rectify that with its Phytl Signs device, which picks up the tiny electrical signals emitted by plants and broadcasts them through a speaker. The ultimate goal is to translate what the plants are actually "saying."


The system, which is currently the subject of a crowdfunding campaign, features two receptors – a stake that is inserted into the soil next to the plant, and a clip that gently connects to a leaf. These measure the voltage coming from the plant, which feeds into a signal processor. From there the plant-speak is output through a built-in speaker. A smartphone app can also receive raw data from a plant, allowing analysis of the signals using data analysis software.

Unlike current plant monitors on the market that measure environmental metrics like soil moisture and sunlight, the Phytl Signs device is claimed to pick up on whether your plant is thriving or stressed, active or quiet, or besieged by pests. The plant responds immediately to a change in lighting or the cutting of a leaf with a spike in sound, an electronic howl akin to a theremin. But decoding what the audio output means is still being worked out by the company.
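
The raw output is essentially a noisy voltage trace, so a first pass at "decoding" is largely a matter of spotting spikes against a rolling baseline. The detector below is a minimal sketch under that assumption; the window size, threshold, and sample data are invented, and Vivent's actual signal processing is not public.

import statistics

def find_spikes(samples, window=50, z_thresh=4.0):
    """Flag samples that deviate from the rolling baseline by more than z_thresh sigmas."""
    spikes = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mu = statistics.mean(baseline)
        sigma = statistics.pstdev(baseline) or 1e-9   # avoid division by zero on flat traces
        if abs(samples[i] - mu) / sigma > z_thresh:
            spikes.append(i)
    return spikes

# A flat trace with one injected "leaf cut" event at sample 120
trace = [0.2] * 200
trace[120] = 1.5
print(find_spikes(trace))   # -> [120]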

To that end, the company encourages device owners to share their data with an online community of fellow users, allowing the company to crowdsource the data to help them decode and translate the plant signals so they can be understood.

Ultimately, if and when the signals are translated, it would allow plant owners to provide the best growing conditions possible. The company also envisions using the devices for agriculture research, and on a commercial scale to monitor crops and potentially improve yields and minimize water use. It can be used on any plant as long as the leaf is wide enough for the clip to connect.

The company has launched a Kickstarter campaign to produce its gadgets, improve its software and further study what the plant signals mean. The minimum pledge level for an Explorer kit is CHF129 (approx. US$130), with shipping slated for April, 2017 if everything goes as planned.

Source: Phytl Signs, gizmag

Google’s ‘FASTER’ undersea cable goes online with 60 Tbps of bandwidth

Google FASTER

You probably have a wireless network at home, but for some applications a wired connection is still more reliable. It’s the same with internet backbone communications: satellites help keep the world in sync, but the best connections across the globe rely on undersea fiber optic cables. A new undersea cable constructed with Google’s backing has just gone online, linking the US west coast with Japan.

The cable, which has the fitting name “FASTER,” can transmit 60 terabits of data per second, more than any other active undersea cable. It’s about 10 million times faster than your home broadband connection on a good day. The new cable will benefit users near one end or the other when they need to ping a server on the other end. It doesn’t boost their own bandwidth, but it could allow them to take fuller advantage of it. FASTER also includes an additional connection from Japan to Taiwan, which has 20 Tbps of bandwidth and is owned completely by Google.

Google joined this ambitious construction project back in 2014 when it partnered with five other companies: NEC, China Mobile, China Telecom, Global Transit, and KDDI. The project has cost about $300 million to complete, but it will offer huge speed increases for data transmission between Asia and North America. Google’s participation in the project guarantees it 10 Tbps of dedicated bandwidth on the FASTER cable. Google is also planning to launch its Google Cloud Platform East Asia in Tokyo this year. The dedicated bandwidth from FASTER will result in faster transfers and lower latency for its customers.

Google FASTER undersea cable map

FASTER stretches some 9,000 kilometers (5,592 miles) across the ocean. The US end is in Bandon, Oregon, and the Japanese end plugs into Shima and Chikura. The US landing places the cable very near Google’s data center in The Dalles. FASTER uses six fiber pairs to push all that bandwidth, using 100 different wavelengths of light. Every 60 kilometers there’s a repeater that boosts the signal to ensure no data is lost along the way, according to Google’s senior vice president of technical infrastructure, Urs Hölzle.
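
Those figures fit together neatly: six fiber pairs carrying 100 wavelengths each implies a line rate of 100 Gbps per wavelength, a common rate for cables of this generation. The quick check below uses assumed round numbers (including the 2016-era home connection speed) rather than Google’s published link budget.

# Back-of-the-envelope check on the quoted capacity figures (assumed round numbers).
fiber_pairs = 6
wavelengths_per_pair = 100
total_tbps = 60

per_wavelength_gbps = total_tbps * 1000 / (fiber_pairs * wavelengths_per_pair)
print(per_wavelength_gbps)                       # 100.0 Gbps per wavelength

home_broadband_mbps = 6                          # a plausible 2016-era connection
print(total_tbps * 1e6 / home_broadband_mbps)    # ~10 million times faster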

This cable might be the speed king right now, but that won’t be the case for long. Earlier this year, Microsoft and Facebook announced they would be laying a cable from the US to southern Europe with a capacity of 160 Tbps across eight cable pairs. I guess Google will just have to limp along with FASTER.

Source: ExtremeTech

12 April 2016

Experimental battery uses bacteria to charge and recharge

bacteria battery

Rechargeable battery technology has been improving incrementally in recent years, but we’re still working with the same heavy, dangerous, expensive materials. A group of researchers from The Netherlands has devised a new biological battery that charges and discharges with the aid of bacteria. They’ve tested this system on the small scale and managed 15 charge cycles in a row.

This “bioelectrochemical” battery consists of two parts. There’s a microbial electrosynthesis (MES) module that takes electrons and uses them to generate acetate, a compound that effectively stores the energy in chemical form. The other side of the battery is a microbial fuel cell (utilizing various anaerobic bacteria) that oxidizes that acetate, releasing electrons. These are then fed into a circuit to harvest the power that was stored in the first step. More power can be added to the MES system to recharge, and the whole process starts over again.

The team tested this design by feeding power in over the course of 16 hours. It then provided power over the course of 8 hours. Does that sound like it might mesh well with any particular type of technology? Yep, it’s a great match for solar power, and indeed that’s the application the researchers have in mind. In areas that have lots of sunlight, there’s an almost unlimited supply of power during the day, but you have to store that power for use at night.

bacteria battery working

The bacterial battery described in the paper might be ideal for storing energy from solar power, but first some improvements need to be made. For one, the efficiency isn’t what we’d expect from a modern lithium-polymer battery. The team reports roughly 30-40% cycle efficiency, compared with upward of 80% in the best batteries we have now. The bacterial batteries would also need a bit more care than a lithium-ion system. If the bacteria inside were to die, the battery would stop working.
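
"Cycle efficiency" here is just the round-trip ratio of energy recovered to energy supplied. The numbers below are illustrative assumptions chosen to be consistent with the reported 16-hour charge, 8-hour discharge, and 30-40 percent range, not figures from the paper.

# Round-trip (cycle) efficiency = energy out / energy in. Power levels are assumed.
charge_hours, charge_power_w = 16, 1.0        # assumed: 16 h of charging at 1 W
discharge_hours, discharge_power_w = 8, 0.7   # assumed: 8 h of discharge at 0.7 W

efficiency = (discharge_hours * discharge_power_w) / (charge_hours * charge_power_w)
print(f"{efficiency:.0%}")                    # 35%, within the 30-40% range reported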

Despite these shortcomings, the study authors believe that this is an important first step. The study includes data from 15 charge cycles of the battery, and it maintained very consistent performance throughout. The self-renewing nature of bacterial colonies might mean this approach has better longevity than lithium-ion, which only works for a few hundred cycles.

With additional research, bioelectrochemical batteries may eventually offer capacity and efficiency similar to conventional ones, but with much lower costs and fewer volatile chemicals. Like so many other proposed battery technologies, though, this one is a few years off.

Source: ExtremeTech

Ultrasound makes for palm-based computer displays you can feel

Ultrasound palm computer
Researchers at the University of Sussex are working to augment palm-based displays by adding tactile sensations to the mix.

From buzzing phones to quivering console controllers, haptic feedback has become indispensable in modern computing, and developers are already wondering how it will be felt in systems of the future. Sending ultrasound waves through the back of the hand to deliver tactile sensations to the front might sound a little far-fetched, but by achieving just that, UK scientists claim to have cleared the way for computers that use our palms as advanced interactive displays.

For years now scientists have been chipping away at the idea of using human skin as a computer display. It sounds unlikely, but with technology becoming more miniaturized, the uptake in wearable devices and more time spent gazing into computer screens, in some ways it seems natural that we use our most readily available surfaces as gateways to the digital realm.

While we're not expecting the very next Fitbit to project your calories burned onto your forearm, some promising prototypes have emerged in this area. The Skinput display system from 2010 used a bio-acoustic sensing array to translate finger taps on the palm into input commands, while the Cicret wristband concept from 2014 envisioned beaming an Android interface onto the arm and used proximity sensors to follow finger movements.

Researchers at the University of Sussex are working to improve palm-based displays by adding tactile sensations to the mix. Importantly, they are aiming to do so without using vibrations or pins, approaches they say have plagued previous efforts as they require contact with the palm and therefore disrupt the display.

So they are looking to sneak in the back door. Their SkinHaptics system relies on an array of ultrasound transmitters that, when applied to the back of the hand, send sensations through to the palm, which can therefore be left exposed to display the screen.

The team says it was able to achieve this through something it calls time-reversal processing. As the ultrasound waves enter through the back of the hand they begin as broad pulses that actually become more targeted as they move through to the other side, landing at a specific point on the palm. The researchers liken it to water ripples working in reverse.
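
That description matches the classic time-reversal trick in acoustics: work out (or record) how long a pulse takes to travel from each transmitter to the focal point, then fire each element early by that amount so the wavefronts re-converge in phase at the target. The delay-only sketch below illustrates the principle; the array geometry, focal point, and sound speed are illustrative assumptions, not the SkinHaptics hardware.

# Delay-only sketch of time-reversal focusing: fire each element early by its
# propagation delay so every wavefront reaches the focus at the same instant.
SPEED_OF_SOUND = 1540.0                      # m/s, roughly soft tissue

transmitters = [(0.00, 0.0), (0.01, 0.0), (0.02, 0.0), (0.03, 0.0)]   # array "on the back of the hand"
focus = (0.015, 0.03)                        # target point "on the palm", 3 cm away

def delay_to(point, tx):
    dx, dy = point[0] - tx[0], point[1] - tx[1]
    return (dx ** 2 + dy ** 2) ** 0.5 / SPEED_OF_SOUND

delays = [delay_to(focus, tx) for tx in transmitters]
latest = max(delays)
fire_times = [latest - d for d in delays]

arrivals = [t + d for t, d in zip(fire_times, delays)]
print([f"{a * 1e6:.2f} us" for a in arrivals])   # identical arrival times -> constructive focus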

"Wearables are already big business and will only get bigger," says Professor Sriram Subramanian, who led the research. "But as we wear technology more, it gets smaller and we look at it less, and therefore multisensory capabilities become much more important. If you imagine you are on your bike and want to change the volume control on your smartwatch, the interaction space on the watch is very small. So companies are looking at how to extend this space to the hand of the user. What we offer people is the ability to feel their actions when they are interacting with the hand."

You can see a prototype of the SkinHaptics system demonstrated in the video below.


Source: University of Sussex, gizmag