Saturday, November 12, 2011

Telexistence robot avatar transmits sight, hearing and touch

TELESAR V, a “telexistence” robot system being researched at Keio University, aims to free people from the constraints of time and space by using remotely operated robots to interact with distant environments, reports DigInfoTV. The operator wears a 3D head-mounted display that covers the entire field of view, seeing and hearing exactly what the robot sees and hears. The sense of touch, recorded as force vectors and temperature data from the robot’s sensors, is also transmitted to the operator, allowing them to feel the shape and temperature of objects.
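As an illustration of the kind of data such a link has to carry, the sketch below shows one hypothetical way to package a per-fingertip haptic sample combining a force vector and a temperature reading; the field names, units and values are assumptions for illustration, not TELESAR V's actual data format.

# Hypothetical haptic sample for a telexistence link (not TELESAR V's real format).
# Each sample pairs a 3-axis force vector with a temperature reading per fingertip.
from dataclasses import dataclass

@dataclass
class HapticSample:
    finger: str                           # e.g. "right_index" (illustrative name)
    force_n: tuple[float, float, float]   # contact force vector, newtons
    temperature_c: float                  # surface temperature, degrees Celsius
    timestamp_ms: int                     # capture time, for syncing with video and audio

sample = HapticSample("right_index", (0.2, 0.0, 1.5), 24.3, 1037)
print(sample)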


Monday, October 24, 2011

Diaceutics. Personalized Medicine

Diaceutics is a highly focused business consulting and software application firm specializing in personalized medicine. Our mission is to improve overall return on investment in personalized medicine by providing pharmaceutical development teams with the personalized medicine-specific knowledge, evidence, tools and operational structure to effectively commercialize and successfully launch targeted therapies.

We greatly improve how our clients view the value propositions in personalized medicine and help them optimize their commercial plans to realize that value. We also address uneven understanding of how personalized medicine works by dispelling widely held but erroneous beliefs through real-world evidence and case studies, allowing our clients to build consensus on the most advantageous path forward.

Drawing from our years of research wholly focused on the personalized medicine space, considering specifically how diagnostic markets can be used to influence and shape therapy markets, we work with our clients on how best to ensure uptake and adoption of companion and other diagnostic tests to optimize and achieve their therapy goals. We ensure our clients leave no stone unturned and no option unconsidered in their quest to be leaders in the personalized medicine space.

Track Record

Diaceutics has a proven track record working with many of the top pharmaceutical companies in the US and Europe on dozens of different personalized medicine assets across multiple therapy areas – including oncology, infectious disease, metabolic disorders, sepsis and cardiac disease – and throughout the development lifecycle.
Diaceutics' standardized and structured process for effectively commercializing and launching companion diagnostics alongside drug development is organized within the same functional frameworks within which every pharmaceutical drug development team works. This significantly reduces the learning curve to optimal personalized medicine planning, saving valuable time and resources.

Diaceutics' content-rich approach to commercializing companion and other diagnostics, delivered in a standardized yet adaptable process, has demonstrable success in moving pharmaceutical asset teams from thinking of personalized medicine as merely a test and a drug to a complete understanding of how a companion diagnostic or diagnostics can unlock the true value of a therapy market.

Demonstrated Value

In today’s challenging economy, no industry can afford to invest time and resources without some clear indication of the likely return on those investments. And like every other high cost, high-pressure industry, the pharmaceutical industry requires clear analytics and functional metrics to provide hard evidence of the value of its investments.

Diaceutics tracks and provides hard evidence of the value of investment in companion diagnostics, enabling asset development teams to quickly verify that their resource allocation in companion diagnostics is driving test adoption. In turn, this allows asset teams to provide quantifiable metrics showing management that their investments in personalized medicine are delivering appropriate returns.

On average, leveraging Diaceutics' standardized and structured process to effectively commercialize and launch companion diagnostics has delivered greater than $100 in return for every $1 invested, as described in our case studies.

Diaceutics Fusion™ is a proprietary, web-based software application that provides a range of business intelligence and performance management applications used for personalized medicine planning, collaboration and decision-making. Using easy-to-navigate route maps and an intuitive user interface, the software incorporates the six years of research, case studies, worktools, data sets, financial models and expertise that Diaceutics has built around the co-development and commercialization of companion and other diagnostic tests. Diaceutics Fusion™ standardizes the planning, analytical, decision and action steps needed to align the co-development and successful commercialization of companion and other diagnostics.

Diaceutics Fusion™ is much more than a simple project management tool or a knowledge database – it directly plugs the power of Diaceutics experience into the hands of the pharmaceutical client and allows deployment on a number of levels:
  • Provides personalized medicine leaders with a tangible resource for therapy area leads and asset development teams seeking insight, expertise and best practices to guide their personalized medicine planning;
  • Allows asset development teams to work efficiently through the 11 key frameworks and decisions underpinning the additional 250 workstreams necessary to align co-development and commercialization of companion and other diagnostics;
  • Provides the ability to revisit and, where necessary, reconsider and revise decisions, while at the same time immediately capturing and incorporating those new decisions into the existing plans for seamless, iterative planning;
  • Creates a foundation upon which corporate knowledge of best practices and decisions in personalized medicine planning may be captured, stored and disseminated in order to build corporate knowledge and internal expertise around personalized medicine planning.



Tiny stamps for tiny sensors

Advances in microchip technology may someday enable clinicians to perform tests for hundreds of diseases — sifting out specific molecules, such as early stage cancer cells — from just one drop of blood. But fabricating such “lab-on-a-chip” designs — tiny, integrated diagnostic sensor arrays on surfaces as small as a square centimeter — is a technically challenging, time-consuming and expensive feat.

Now, an MIT researcher — together with colleagues at the University of Illinois at Urbana-Champaign — has come up with a simple, precise and reproducible technique that cuts the time and cost of fabricating such sensors. Nicholas Fang, associate professor of mechanical engineering, has developed an engraving technique that etches tiny, nano-sized patterns on metallic surfaces using a small, voltage-activated stamp made out of glass. Fang says the engravings, made of tiny dots smaller than one-hundredth the width of a human hair, act as optical antennae that can identify a single molecule by picking up on its specific wavelength.

“If you are able to create an optical antenna with precise dimensions … you can use them to report traffic on the molecular scale,” Fang says.  The researchers reported the new fabrication process in the Sept. 21 online edition of the journal Nanotechnology.

Hurdles to market

The new glass stamp approach may help researchers clear a large hurdle in lab-on-a-chip manufacturing: namely, scale-up. Today scientists fabricate nano-sensors using electron-beam lithography, an expensive and time-consuming technique that uses a focused beam of electrons to slowly etch patterns into metallic surfaces. The process, while extremely precise, is also extremely expensive: Fang says it’s common for facilities to rent such equipment out for $200 per hour. Fabricating a pattern six millimeters square typically takes half a day — so if sensors made using electron-beam lithography were pushed into the commercial market, Fang estimates they would run more than $600 apiece.

“Nobody wants chips that expensive,” Fang says. “Biology tests are looking for something that’s cheap yet reliable. And that excludes some of the fancier, more expensive technologies.”

That may also exclude some cheaper technologies being developed today. For example, nanoimprint lithography is a simple, low-cost process where a moldable polymer is pressed onto a master circuit pattern. When exposed to UV light, the polymer hardens; when peeled off the master circuit, it forms a mold that can be filled with a metal substrate to make a copy of the original circuit pattern. Scientists typically wash the polymer mold away to isolate the new metallic pattern.

However, Fang says this approach, while inexpensive, can also be imprecise. The soft polymer material may not fit exactly around the original pattern, resulting in a mold with bumps, dents and other imperfections — and copies that aren’t exactly the same as the original. Since the process requires washing away the polymer mold, scientists need to use more polymer material to fabricate more copies.

A glass-blowing inspiration

Fang and his colleagues came up with a technique that may solve the cost, precision and reproducibility issues of other technologies. The team took an approach similar to nanoimprint lithography. But instead of polymer, the researchers used glass as a molding material. 

“I was inspired by glassblowers, who actually use their skills to form bottles and beakers,” Fang says. “Even though we think of glass as fragile, at the molten stage, it is actually very malleable and soft, and can quickly and smoothly take the shape of a plaster mold. That’s at a large scale, but amazingly it works very well at a small scale too, at very high speed.”

With this in mind, Fang and his team cast around for a glassy material that would meet their requirements, and found an ideal candidate in a form of superionic glass — glass composed partly of ions, which can be electrochemically activated when pumped with voltage.

The researchers filled a small syringe with glass particles and heated the needle to melt the glass inside. They then pressed the molten glass onto a master pattern, forming a mold that hardened when cooled. The team then pressed the glass mold onto a flat silver substrate, and applied a small, 90-millivolt electric potential above the silver layer. The voltage stimulated ions in both surfaces, and triggered the glass mold to essentially etch into the metal substrate.

The group was able to produce patterns of tiny dots, 30 nanometers wide, arranged into triangles, rectangles and, playfully, an Ionic column — the logo of the University of Illinois — at a resolution more precise than nanoimprint lithography.

“You end up with a better cut,” Fang says. “And we have a stamp that can be reused many times.”

In order to really make an impact on manufacturing sensors at a large scale, the group will have to prove that the stamp can be reused many, many times, according to S.V. Sreenivasan, professor of mechanical engineering at the University of Texas at Austin.

“It has the potential to be significantly lower cost for patterning metals such as silver,” Sreenivasan says. “However, a high-throughput process with long stamp life still needs to be demonstrated. Another valuable contribution might be to focus on recovering silver that is removed during the patterning of metal as this would further address cost-sensitive applications.”

Fang acknowledges that there are still cost barriers to this glass-etching process: It still requires using a master metallic pattern, made via expensive lithography. However, he points out that only one master pattern, and one glass stamp, is needed to mass-produce an entire line of the same sensor, which may bring large-scale production closer to reality.

“With this stamp, I can reproduce maybe tens of hundreds of these sensors, and each of them will be almost identical,” Fang says. “So this is a fascinating advancement to us, and allows us to print more efficient antennae.”

MIT News Office. October 19, 2011

Thursday, October 20, 2011

Seeing through walls in real time

The ability to see through walls is no longer the stuff of science fiction, thanks to new radar technology developed at MIT’s Lincoln Laboratory.

Much as humans and other animals see via waves of visible light that bounce off objects and then strike our eyes’ retinas, radar “sees” by sending out radio waves that bounce off targets and return to the radar’s receivers. But just as light can’t pass through solid objects in quantities large enough for the eye to detect, it’s hard to build radar that can penetrate walls well enough to show what’s happening behind. Now, Lincoln Lab researchers have built a system that can see through walls from some distance away, giving an instantaneous picture of the activity on the other side.


The researchers’ device is an unassuming array of antennas arranged in two rows — eight receiving elements on top, 13 transmitting ones below — and some computing equipment, all mounted on a movable cart. But it has powerful implications for military operations, especially “urban combat situations,” says Gregory Charvat, a member of the technical staff at Lincoln Lab and the leader of the project.



Waves through walls

Walls, by definition, are solid, and that’s certainly true of the four- and eight-inch-thick concrete walls on which the researchers tested their system.

At first, their radar functions like any other: Transmitters emit waves of a certain frequency in the direction of the target. But in this case, each time the waves hit the wall, the concrete blocks more than 99 percent of them from passing through. And that’s only half the battle: Once the waves bounce off any targets, they must pass back through the wall to reach the radar’s receivers — and again, 99 percent don’t make it. By the time the signal reaches the receivers, it is reduced to about 0.0025 percent of its original strength.
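That round-trip figure follows directly from the one-way loss. A minimal back-of-the-envelope sketch in Python, assuming roughly 0.5 percent of the signal survives each pass through the wall (the exact one-way fraction is an assumption chosen to be consistent with the numbers above):

# Round-trip attenuation through a concrete wall (illustrative figures only).
# Assumption: "more than 99 percent blocked" is taken as ~99.5 percent, so
# about 0.5 percent of the signal survives a single pass through the wall.
one_way_transmission = 0.005           # fraction surviving one pass

# The signal crosses the wall twice: out to the target and back to the receivers.
round_trip = one_way_transmission ** 2

print(f"{round_trip:.6%} of the original signal reaches the receivers")
# -> 0.002500%, matching the ~0.0025 percent figure quoted above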



But according to Charvat, signal loss from the wall is not even the main challenge. “[Signal] amplifiers are cheap,” he says. What has been difficult for through-wall radar systems is achieving the speed, resolution and range necessary to be useful in real time. “If you’re in a high-risk combat situation, you don’t want one image every 20 minutes, and you don’t want to have to stand right next to a potentially dangerous building,” Charvat says.

The Lincoln Lab team’s system may be used at a range of up to 60 feet away from the wall. (Demos were done at 20 feet, which Charvat says is realistic for an urban combat situation.) And, it gives a real-time picture of movement behind the wall in the form of a video at the rate of 10.8 frames per second.

Filtering for frequencies

One consideration for through-wall radar, Charvat says, is what radio wavelength to use. Longer wavelengths are better able to pass through the wall and back, which makes for a stronger signal; however, they also require a correspondingly larger radar apparatus to resolve individual human targets. The researchers settled on S-band waves, which have about the same wavelength as wireless Internet — that is, fairly short. That means more signal loss — hence the need for amplifiers — but the actual radar device can be kept to about eight and a half feet long. “This, we believe, was a sweet spot because we think it would be mounted on a vehicle of some kind,” Charvat says.
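The size-versus-wavelength trade can be made concrete with a rough calculation. The sketch below assumes a 2.4-gigahertz S-band center frequency (the same band as Wi-Fi) and the textbook cross-range resolution of roughly wavelength times range divided by aperture length; neither figure comes from the Lincoln Lab work, so treat the numbers as illustrative only.

# Illustrative wavelength / aperture trade-off for a through-wall radar.
# Assumptions (not from the article): 2.4 GHz S-band center frequency and the
# rule-of-thumb cross-range resolution ~ wavelength * range / aperture length.
C = 299_792_458.0                      # speed of light, m/s

def wavelength_m(freq_hz: float) -> float:
    return C / freq_hz

def cross_range_resolution_m(wavelength: float, range_m: float, aperture_m: float) -> float:
    return wavelength * range_m / aperture_m

lam = wavelength_m(2.4e9)              # ~12.5 cm, comparable to Wi-Fi
aperture = 8.5 * 0.3048                # the ~8.5-foot antenna array, in meters
standoff = 20 * 0.3048                 # the 20-foot demo range, in meters

print(f"wavelength: {lam * 100:.1f} cm")
print(f"cross-range resolution at 20 ft: "
      f"{cross_range_resolution_m(lam, standoff, aperture):.2f} m")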

Even when the signal-strength problem is addressed with amplifiers, the wall — whether it’s concrete, adobe or any other solid substance — will always show up as the brightest spot by far. To get around this problem, the researchers use an analog crystal filter, which exploits frequency differences between the modulated waves bouncing off the wall and those coming from the target. “So if the wall is 20 feet away, let’s say, it shows up as a 20-kilohertz sine wave. If you, behind the wall, are 30 feet away, maybe you’ll show up as a 30-kilohertz sine wave,” Charvat says. The filter can be set to allow only waves in the range of 30 kilohertz to pass through to the receivers, effectively deleting the wall from the image so that it doesn’t overpower the receiver.
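A toy simulation makes the wall-rejection idea concrete: each reflector's range maps to a beat-frequency tone, and a filter that passes only the higher tones removes the dominant wall return. The tone frequencies, sample rate and digital Butterworth filter below are illustrative assumptions standing in for the analog crystal filter described above.

# Toy simulation of range-dependent beat tones and wall rejection.
# Assumptions: a 20 kHz "wall" tone, a 30 kHz "target" tone, 200 kHz sampling,
# and a digital high-pass filter standing in for the analog crystal filter.
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 200_000                           # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)         # 10 ms of received signal

wall = 100.0 * np.sin(2 * np.pi * 20_000 * t)    # wall: close, so very strong
target = 0.1 * np.sin(2 * np.pi * 30_000 * t)    # person: farther and far weaker
received = wall + target

# High-pass filter with its cutoff between the two tones (25 kHz).
sos = butter(8, 25_000, btype="highpass", fs=fs, output="sos")
filtered = sosfiltfilt(sos, received)

def tone_amplitude(signal: np.ndarray, freq: float) -> float:
    """Rough amplitude of a single tone, via correlation with a complex exponential."""
    return 2 * abs(np.dot(signal, np.exp(-2j * np.pi * freq * t))) / len(t)

print("wall tone   before/after filtering:",
      round(tone_amplitude(received, 20_000), 3), "->",
      round(tone_amplitude(filtered, 20_000), 3))
print("target tone before/after filtering:",
      round(tone_amplitude(received, 30_000), 3), "->",
      round(tone_amplitude(filtered, 30_000), 3))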

“It’s a very capable system mainly because of its real-time imaging capability,” says Robert Burkholder, a research professor in Ohio State University’s Department of Electrical and Computer Engineering who was not involved with this work. “It also gives very good resolution, due to digital processing and advanced algorithms for image processing. It’s a little bit large and bulky for someone to take out in the field,” he says, but agrees that mounting it on a truck would be appropriate and useful.

Monitoring movement

In a recent demonstration, Charvat and his colleagues, Lincoln Lab assistant staff John Peabody and former Lincoln Lab technical staff Tyler Ralston, showed how the radar was able to image two humans moving behind solid concrete and cinder-block walls, as well as a human swinging a metal pole in free space. The project won best paper at a recent conference, the 2010 Tri-Services Radar Symposium.

Because the processor uses a subtraction method — comparing each new picture to the last, and seeing what’s changed — the radar can only detect moving targets, not inanimate objects such as furniture. Still, even a human trying to stand still moves slightly, and the system can detect these small movements to display that human’s location.
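A minimal sketch of that subtraction step, using frame differencing on two made-up 2D "radar images"; the array size, target positions and threshold are arbitrary illustrative choices, not the Lincoln Lab processing chain.

# Minimal frame-differencing sketch: static returns cancel, movers remain.
# The image size, pixel values and threshold are illustrative only.
import numpy as np

def detect_movement(previous: np.ndarray, current: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Return a boolean mask of pixels that changed between two consecutive frames."""
    return np.abs(current - previous) > threshold

# Two toy 8x8 "radar images": a bright static wall plus a weak moving target.
prev_frame = np.zeros((8, 8))
curr_frame = np.zeros((8, 8))
prev_frame[0, :] = 100.0               # wall: strong but identical in both frames
curr_frame[0, :] = 100.0
prev_frame[5, 2] = 1.0                 # person at one position...
curr_frame[5, 3] = 1.0                 # ...who has shifted one cell between frames

print(np.argwhere(detect_movement(prev_frame, curr_frame)))
# The wall cancels out; only the mover's old and new positions remain.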

The system digitizes the signals it receives into video. Currently, humans show up as “blobs” that move about the screen in a bird’s-eye-view perspective, as if the viewer were standing on the wall and looking down at the scene behind. The researchers are currently working on algorithms that will automatically convert a blob into a clean symbol to make the system more end-user friendly. “To understand the blobs requires a lot of extra training,” Charvat says.

With further refinement, the radar could be used domestically by emergency-response teams and others, but the researchers say they developed the technology primarily with military applications in mind. Charvat says, “This is meant for the urban war fighter … those situations where it’s very stressful and it’d be great to know what’s behind that wall.”

Related

Todd Kuiken: A prosthetic arm that "feels"

Physiatrist and engineer Todd Kuiken is building a prosthetic arm that connects with the human nervous system -- improving motion, control and even feeling. Onstage, patient Amanda Kitts helps demonstrate this next-gen robotic arm.

Saturday, October 15, 2011

Bilateral hand transplant

Last week, a team of more than 40 surgeons, nurses, anesthesiologists, residents, radiologists and physician assistants at Brigham and Women’s Hospital (BWH) worked for more than 12 hours to perform a bilateral hand transplant for Richard Mangino, 65, of Revere. Mangino, a quadruple amputee, lost his arms below the elbows and legs below the knees after contracting sepsis in 2002.

The transplant involved multiple tissues including skin, tendons, muscles, ligaments, bones and blood vessels on both the left and right forearms and hands. In August 2010, BWH announced the development of a hand transplant program, and the BWH team performed its first bilateral hand transplant in May 2011. Consent for the donation of the hands was obtained by New England Organ Bank staff after conversations with the donor family. Registering as an organ and tissue donor on a driver’s license is not accepted as consent for this type of donation; family consent is required (source).

Hand transplantation surgery, the transfer of the hand(s) from a deceased human donor to a patient with amputation of one or both hands, is an experimental reconstructive procedure that has the potential to significantly improve the lives of hand amputees.
In early October 2011, a Brigham and Women’s Hospital (BWH) team of more than 40 surgeons, nurses, anesthesiologists, residents, radiologists, and physician assistants worked for more than 12 hours to perform a bilateral (double) hand transplant for Richard Mangino, 65, of Revere, MA. Mangino, a quadruple amputee, lost his arms below the elbows and legs below the knees after contracting sepsis in 2002. The transplant involved a composite of multiple tissues, including skin, tendons, muscles, ligaments, bones, and blood vessels on both the left and right forearms and hands.

The BWH hand transplant team, representing a wide variety of medical and surgical specialties, now hopes to build upon this success to provide other amputee patients with the significant benefits of hand transplantation. Toward this goal, BWH is actively seeking qualified candidates for our hand transplant research study. Our team will be studying a small group of people to learn more about how to advance the science of hand transplantation, how to support and limit transplant rejection issues, and how people do after hand transplantation.

We describe hand transplant surgery as a life-giving procedure because it has the potential to dramatically improve, i.e., restore, both a patient’s mental and physical health and his/her ability to function and integrate in society. However, as with any other type of organ transplantation, this improvement will require the patient to make a lifetime commitment to taking medications that suppress the body’s immune system.

Functionally, hand transplant surgery can provide a patient with new hands that, after extensive rehabilitation, allow him/her to perform daily activities and, in most cases, return to work. Furthermore, the ability to restore a near-normal aesthetic appearance of the hand(s) can lead to tremendous psychological benefits, including elevated confidence and mood.




Performing Miracles (no music) from BWH Public Affairs on Vimeo.

