Over the years, robots have evolved from stationary machines performing a single specific task to versatile helpers that act on their own. Today robots assist humans with almost any activity, cooking, dancing, playing, serving, and some can even show emotions. Following this very line of innovation comes the Care-O-bot 3.
Care-O-bot 3 is a next-generation service robot developed by research scientists at the Fraunhofer Institute in Stuttgart, Germany, intended mainly to help humans with daily life around the household. This highly flexible one-armed robot is popularly known as the "robot that serves": it can carefully pick up bottles, cups and similar objects with its three fingers and serve them to you.
Care-O-bot 3 at IREX 2009
To correctly locate the items it needs to pick up, Care-O-bot 3 is fitted with stereo-vision color cameras, laser scanners and a 3-D range camera that registers its surroundings in three dimensions in real time. Its database holds three-dimensional impressions of numerous household articles, letting it reliably distinguish between the items it is asked to fetch. Users can also teach the robot new objects: place an unfamiliar item in its hand and it automatically acquires a 3-D impression of it.
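
To make the idea concrete, here is a minimal sketch of that kind of lookup: a freshly scanned shape descriptor is matched against the stored impressions and the closest known object wins. The descriptors, dimensions and names below are purely illustrative; the robot's real perception pipeline is far more sophisticated.

```python
# A minimal sketch with made-up numbers: match a new 3-D scan against
# stored "impressions" by nearest neighbour. Descriptors here are just
# (width, depth, height) in metres; the real robot uses far richer models.
import math

KNOWN_OBJECTS = {
    "bottle": (0.07, 0.07, 0.30),
    "cup":    (0.09, 0.09, 0.10),
    "plate":  (0.25, 0.25, 0.02),
}

def recognise(scan):
    """Return the known object whose descriptor is closest to the scan."""
    return min(KNOWN_OBJECTS, key=lambda name: math.dist(scan, KNOWN_OBJECTS[name]))

def learn(name, scan):
    """Teach the robot a new object, as the article describes."""
    KNOWN_OBJECTS[name] = scan

print(recognise((0.08, 0.08, 0.28)))   # -> bottle
learn("mug", (0.10, 0.08, 0.12))
print(recognise((0.10, 0.08, 0.13)))   # -> mug
```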

Care-O-bot 3 is only 1.45 meters high and rides on an omnidirectional platform with four separately steered and driven wheels, letting it move in any direction with ease. Other important features include the ability to follow spoken commands, recognition of and response to a number of gestures, and sensors that prevent it from touching humans inadvertently.

Technology has taken deep root in every field nowadays; it is hard to imagine a world without high-performance computing, and for any organization it would be a nightmare to function without automated systems. In management, computers play a vital role, directly or indirectly, at all three levels: operational, middle and top. Let us see how computers serve each level of management.

At the operational level of any organization there are thousands of transactions to be performed daily. These transactions drive the routine business activity and affect the organization's overall performance; they may involve calculations, or summarizing and sorting of data. Most organizations have automated computer systems for handling their transactions. Computers drastically increase the speed at which transactions are processed and provide greater accuracy, and they can be reprogrammed from time to time as activities change.
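
As a tiny illustration of that kind of transaction handling, the sketch below performs the calculations, summarizing and sorting mentioned above on a few made-up records; it is illustrative only, not any particular organization's system.

```python
# Illustrative only: a day's transactions, summarized and sorted.
transactions = [
    ("sale", 120.00),
    ("refund", -40.00),
    ("sale", 75.50),
    ("sale", 210.25),
]

total = sum(amount for _, amount in transactions)        # calculation
by_amount = sorted(transactions, key=lambda t: t[1],     # sorting
                   reverse=True)
count_by_type = {}                                       # summarizing
for kind, _ in transactions:
    count_by_type[kind] = count_by_type.get(kind, 0) + 1

print(f"daily total: {total:.2f}")        # 365.75
print(f"largest first: {by_amount[0]}")   # ('sale', 210.25)
print(f"counts: {count_by_type}")         # {'sale': 3, 'refund': 1}
```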

Middle management benefits the most from computers and automated systems. Computers help managers take crucial decisions and solve problems: with precise data at hand, a manager can draw conclusions and make better decisions in no time. Daily reports in graphical format make the manager's work easier, and rises and falls in employee performance can be easily traced with automated systems.

In most companies, top-level management uses executive information systems, structured and automated tracking systems that provide rapid access to timely information. Their major advantage is that they update top management on the slightest changes in working conditions and keep them abreast of what is happening in the major areas of the business.

Continuous Ink Systems (CIS) are aftermarket kits for inkjet printers. These systems supply ink to special cartridges through tubes connected to external bottles or reservoirs, so the user need not worry about a cartridge running dry or about replacing it; when the bottles run low, the user simply adds more ink. Continuous Ink Systems are designed to fit most Canon, HP, Epson and other printers.

The bottles that form part of the ink system sit alongside the printer; their number varies with the printer model, and each bottle is filled with the appropriate color of ink. A tube runs from each bottle cap to the matching ink cartridge inside the printer, and the tubes are bound together into a "ribbon" cable. The kits include a syringe for the initial setup, used to clear the tubes of air and help draw the ink through.

Continuous Ink Systems are well suited to large runs: photographs and other big print jobs can be carried out easily without worrying about running out of ink. Another advantage of this system is cost-effectiveness, as the included cartridges are specially designed for longevity.

Installation of a Continuous Ink System is very easy and takes only 5-10 minutes, depending on the kit and the printer. It comprises two major steps: loading the cartridges, then installing the inks along with the carts and the tubes.

Take care that the reservoir bottles are not positioned higher than the printer, as gravity could make the ink flood, causing improper printing and hurting print quality. Installing the system makes no permanent changes to the printer, so the user can easily switch back to standard cartridges. Continuous Ink Systems are easy to install and use, and can be purchased online directly from a dealer.

Software development is a set of activities which, performed together and in accordance with each other, produce the desired result. Software development methodologies are used to build computer information systems, and the growth of such a system passes through various stages collectively referred to as the Software Development Life Cycle (SDLC).
The development of software is an iterative process and comprises the following identifiable stages:
Preliminary Investigation: in this stage the real problem with the existing system is identified, and research is conducted to find its root cause and ways to solve it. The preliminary investigation further consists of steps such as initiating the request, feasibility analysis and clarification of the request.

System Analysis is another vital stage of the SDLC, in which the existing procedures and information flow are traced and reviewed. System analysis involves defining the system, separating it into smaller manageable parts, and understanding the nature, function and interrelationships of the subsystems.

In the System Design stage the actual physical system is designed by specialists with reference to the logical design. The design process commences by identifying the outputs the new or modified system will produce, and the analysts and designers use automated design tools or software to create a tentative design sketch.

Once an appropriate design is selected, it is sent to the development and coding stage, where it is translated into program code. Skilled programmers are hired for the system coding.

Once the software is ready it is sent to the testing phase, where it is thoroughly tested for errors and the analyst makes sure the software does not fail under expected conditions. The testing stage comprises checking the logical interfaces, designing test cases, and checking the quality of the code and its adherence to the design.
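
As a small illustration of the "designing test cases" activity, here is a sketch using Python's built-in unittest module. The function under test, apply_discount, is a hypothetical stand-in for whatever the coding stage produced.

```python
# A minimal sketch of test-case design: a normal case, boundary values,
# and an invalid input that should fail loudly.
import unittest

def apply_discount(price, percent):
    """Hypothetical function from the coding stage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100)

class TestApplyDiscount(unittest.TestCase):
    def test_normal_case(self):
        self.assertAlmostEqual(apply_discount(200.0, 10), 180.0)

    def test_boundary_values(self):
        self.assertEqual(apply_discount(50.0, 0), 50.0)    # no discount
        self.assertEqual(apply_discount(50.0, 100), 0.0)   # full discount

    def test_invalid_input_fails_loudly(self):
        with self.assertRaises(ValueError):
            apply_discount(50.0, 150)

if __name__ == "__main__":
    unittest.main()
```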

The final and most vital stage is Implementation and Evaluation, in which users are trained to use the new software and the application is checked to make sure it runs smoothly on the actual system without flaws.

The Dvorak keyboard was designed with the aim of maximizing typing efficiency. It is named after its inventor, Dr. August Dvorak, a professor of education, who designed it in the 1930s with his brother-in-law, William Dealey. Unlike the traditional QWERTY keyboard, the Dvorak layout places the most common letters on the middle row, positioned so that they can be typed swiftly.

Besides the standard layout, there are two additional Dvorak keyboards, a left-handed one and a right-handed one, designed for people who type with only one hand. The Dvorak layout is available in the keyboard settings (control panel) of every modern computer. It is considerably more comfortable than the old standard QWERTY pattern, which makes no real attempt at typing comfort; Dvorak was deliberately designed to increase both speed and accuracy.

The Dvorak keyboard is also much easier to learn than QWERTY, especially for new typists. The beginning lessons are more productive and interesting, since thousands of real words can be typed on the home row alone. Accuracy is another advantage: Dvorak users tend to make fewer mistakes while typing, and most people who switch do so because it is more comfortable. The Dvorak layout was carefully fitted to the English language, whereas QWERTY's layout is essentially arbitrary.
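
The home-row claim is easy to check for yourself. The sketch below counts what fraction of the letters in a sample sentence fall on each layout's home row; the sample text is arbitrary and only letter keys are counted.

```python
# A quick check of the home-row claim above.
QWERTY_HOME = set("asdfghjkl")    # QWERTY home-row letters
DVORAK_HOME = set("aoeuidhtns")   # Dvorak home-row letters

def home_row_share(text, home_row):
    """Fraction of the letters in `text` that sit on the given home row."""
    letters = [c for c in text.lower() if c.isalpha()]
    return sum(c in home_row for c in letters) / len(letters)

sample = "the quick brown fox jumps over the lazy dog"
print(f"QWERTY: {home_row_share(sample, QWERTY_HOME):.0%}")  # ~29%
print(f"Dvorak: {home_row_share(sample, DVORAK_HOME):.0%}")  # ~51%
```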

Dvorak makes typing more natural and easier, and many people find switching to it a seamless way of learning touch-typing. Its strengths are speed, accuracy, comfort, less finger travel, ease of learning and portability. Although the Dvorak keyboard has not replaced QWERTY, it has become much easier to access in the computer age: all major operating systems, including Linux and Microsoft Windows, support it.

Even your best raincoat ends up wet if it is soaked under water long enough. But Swiss chemists are now developing a new waterproof material that never gets wet, even when completely submerged.

Nature's example of extreme water resistance is the surface of the lotus leaf. Taking a similar approach, scientists have developed a fabric that never gets wet, using a combination of water-repelling substances and tiny nanostructures.

Lead researcher Stefan Seeger at the University of Zurich says the fabric, made from polyester fibers coated with millions of tiny silicone filaments, is the most water-repellent clothing-appropriate material ever created. "The water comes to rest on the top of the nanofilaments like a fakir sitting on a bed of nails," he says. Drops of water sit as spherical balls on top of the fabric, and a tilt of just 2 degrees from horizontal sends them rolling off like marbles. A jet of water aimed at the fabric simply bounces off without a trace.

The secret of this incredible water resistance is the layer of silicone nanofilaments, which are highly chemically hydrophobic. The spiky structure of the 40-nanometer-wide filaments strengthens that effect, creating a coating that prevents water droplets from soaking through to the polyester fibers underneath.

Clothing made this way can be soaked under water for long periods and come out as dry as the day it went in. The same technique can be applied to wool and cotton as well as polyester, so this type of fabric would be ideally suited for swimsuits and raincoats.

Anyway, let's hope it soon comes to market, so we can go out in the rain without any fear of getting wet.


Medical monitoring by nanosensors
Warning signs: Vista Therapeutics is commercializing nanowire sensors for detecting early warnings of organ failure in trauma patients. Arrays of silicon nanowires like those on the chips above can detect individual proteins in unprocessed blood samples. The nanowires are labeled with antibodies, the greenish-gray blobs in the chip image at right; the antibodies are also visible in the fluorescent images at left.
Credit: Vista Therapeutics


Physicians often test the levels of a few telltale blood proteins in seriously injured or ill patients to detect organ failure and other problems. Now Vista Therapeutics, a startup based in Santa Fe, NM, hopes to improve the care of these patients with sensitive devices for continuous bedside monitoring of such blood biomarkers.
Instead of taking daily snapshots of the patient's levels of blood proteins, the company's nanosensors should allow for continuous monitoring of changes that occur over periods of only a few hours.

Spencer Farr, CEO of Vista Therapeutics, says that the first application of the technology will be for careful monitoring of patients whose status can change rapidly--such as those in the ICU after suffering a heart attack or traumatic injuries from a car accident. "We envision having a branch in the patient's IV that tests continuously or every five to ten minutes," says Farr. The nanowires are sensitive enough that they should be able to detect trace biomarkers that diffuse into the IV line from the blood. After a car wreck, for example, patients could be closely monitored for molecular warning signs of impending kidney and other organ failure.

To make the detectors, Vista Therapeutics has licensed nanowire sensing technologies developed by Harvard University chemist Charles Lieber. Silicon nanowires, semiconducting wires as thin as two nanometers, have what Lieber calls the "ultimate sensitivity," even with completely unprocessed samples such as blood. When a single protein binds to an antibody along the wire, the current flowing through the wire changes. Arrays of hundreds of nanowires, each designed to detect a different molecule in the same sample, can be arranged on tiny, inexpensive chips. The changes can be monitored continuously as molecules bind and unbind, making it possible to detect subtle trends over time, without requiring multiple blood draws.
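
The payoff of continuous readings over daily snapshots is trend detection, which a short sketch can make concrete. The thresholds, intervals and readings below are made-up numbers for illustration, not Vista's actual system.

```python
# A minimal sketch of continuous trend detection: a reading every
# 5 minutes, with an alert when a biomarker rises faster than some
# hypothetical clinical threshold.
INTERVAL_MIN = 5     # minutes between readings
RISE_ALERT = 0.5     # hypothetical alert threshold, units/hour

def check_trend(readings):
    """Flag the first interval whose rate of rise exceeds the threshold."""
    for i in range(1, len(readings)):
        rate_per_hour = (readings[i] - readings[i - 1]) * 60 / INTERVAL_MIN
        if rate_per_hour > RISE_ALERT:
            return f"alert at minute {i * INTERVAL_MIN}: +{rate_per_hour:.2f}/h"
    return "no alert"

# A rise that a once-a-day blood draw would miss, but 5-minute sampling catches.
print(check_trend([1.00, 1.01, 1.02, 1.10, 1.25]))
```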

The standard protein-detection technique, ELISA, is very sensitive but, Farr says, takes 90 minutes to perform. It starts with a blood draw that must be extensively processed--first to purify the proteins, then to label them with fluorescent dyes--and then tested with expensive imaging equipment in a hospital lab. "ELISA is a powerful technology for one-time measurements," says Farr, "but there's no existing technology for continuous biomarker measurement."

The sensitivity of nanowire detectors should also open up the possibility of finding new biomarkers. The blood biomarkers that doctors routinely test for--including prostate-specific antigen for cancer screening and c-reactive protein, a sign of heart failure--can be monitored with ELISA because their levels change over days or weeks. Because nanowire sensors allow for extremely sensitive, continuous monitoring, they should allow doctors to monitor the levels of blood proteins and other molecules whose concentration changes over a much shorter timescale. Changes in these biomarkers are currently undetectable. "We expect we'll be able to include those that change rapidly, peaking within a matter of a few hours," says Farr. Because it hasn't been practical to make such measurements before, it's not clear just what these biomarkers will be, but Farr hopes that Vista will uncover them.

Initially, Vista will market clinical devices for monitoring known biomarkers in IV lines. In the future, the company might develop implantable chips for patients with chronic diseases such as diabetes. A nanowire chip in an artery in the wrist might continuously monitor blood glucose and proteins indicative of early liver damage and other diabetic complications. The device could send alerts to a wristwatch. Because nanowires are so sensitive and inexpensive, they could also find their way into home tests for cancer, where early detection is key, says Farr.


via [technologyreview]

HP Labs memristor
An image from an atomic force microscope shows a circuit with 17 memristors lined up in a row. Every memristor shares the same bottom wire (top left to bottom right), and each has its own top wire. The memristors are between the wires.
Credit: J. J. Yang, HP Labs

Researchers at HP Labs have fabricated a memristor, or memory resistor--a fundamental electronic device that had been described theoretically but never produced until now.
The amount of charge that flows through the device can be changed by exposing it to an electrical voltage. Applying a positive voltage lowers its resistance, and applying a negative voltage increases it. Furthermore, the change in resistance is proportional to the length of time the voltage is applied: the more the device is charged, the more electricity it conducts. Once set, the resistance stays the same until it's reset.
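
The behavior just described, resistance that drifts under an applied voltage in proportion to how long it is applied and then holds its value, is easy to caricature in a few lines. The sketch below is a qualitative toy model with made-up numbers, not HP's published device physics.

```python
# Toy model of the qualitative memristor behavior described above:
# positive voltage lowers resistance, negative voltage raises it, the
# change scales with how long the voltage is applied, and the value
# persists once the voltage is removed. All numbers are illustrative.
R_ON, R_OFF = 100.0, 16_000.0   # hypothetical resistance bounds, ohms
DRIFT = 500.0                   # hypothetical drift rate, ohms per volt-second

def apply_voltage(resistance, volts, seconds):
    """Return the new resistance after driving the device."""
    resistance -= DRIFT * volts * seconds      # positive volts lower R
    return min(max(resistance, R_ON), R_OFF)   # clamp to physical bounds

r = 8_000.0
r = apply_voltage(r, +1.0, 2.0)   # 2 s of +1 V: resistance drops to 7000
r = apply_voltage(r, -1.0, 1.0)   # 1 s of -1 V: partial recovery to 7500
print(r)  # the value holds until the next voltage is applied
```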

Why it matters: Memristors could lead to nonvolatile memory chips that store more data than flash memory. They could also be used in processors designed to mimic aspects of the human brain. In the brain, learning depends on changes in the strength of connections between neurons; a memristor can be used to set the strength of connections between transistors, achieving a similar effect. Chips using memristors could be useful for face recognition and for controlling robot movement.

Methods:
The new memristors consist of two layers of titanium dioxide sandwiched between two electrical contacts. One layer of titanium dioxide is an insulator, blocking the flow of electrons from one contact to the other. The other layer, which has fewer oxygen atoms than titanium dioxide normally does, conducts electricity.

When a voltage is applied, some of the oxygen ions move from the first layer into the oxygen-deficient layer. That improves the conductivity of the first layer, allowing electrons to pass through the memristor from one contact to the other.

Next steps: Since Hewlett-Packard doesn't make memory chips, the technology will probably be licensed to another company for product development. The HP researchers are working on a prototype that combines transistors and memristors to form a brainlike chip.

IBM noise-free nano labs
Believing that shielded labs are vital to the future of nanoelectronics, IBM announced plans to build the world's largest "noise free" nanoelectronic fabrication facilities in Switzerland. By shielding equipment from external electromagnetic, thermal, and seismic noise, the new facilities should help advance research in a wide range of fields, such as spintronics, carbon-based devices, and nanophotonics, says IBM.


As electronics research shifts to ever smaller scales, a stable laboratory environment becomes increasingly important, says Matthias Kaiserswerth, director of the IBM Zurich Research Laboratory. If you're trying to design a new transistor by manipulating individual electrons moving through a carbon nanotube, any disturbance--a truck rumbling past or a nearby vacuum cleaner--can disrupt your experiment and leave you with irreproducible results.

"What we're trying to get to is something that is truly noise free, shielding against all these influences," says Kaiserswerth. Eventually, Kaiserswerth says, these kinds of facilities will become for nanoelectronics what clean rooms are for conventional silicon electronics.

But Xiang Zhang, director of the Nano-Scale Science and Engineering Center at the University of California, Berkeley, says that it's precisely IBM's willingness to take risks with its new facility that will create excitement in the nanotech community. "This is a good sign," he says.

The new labs are part of a $90 million, 65,000-square-foot facility being built by IBM in collaboration with the Swiss Federal Institute of Technology, also in Zurich. One-third of the $90 million will go toward building 10,000 square feet of clean-room facilities and 2,000 square feet of noise-free labs. Although nanotech labs elsewhere are shielded in various ways, says Kaiserswerth, "this 200 square meters will be unique. These noise-free labs will give us a competitive edge so we can move forward faster."

"IBM is in the business of making computer chips," says Kaiserswerth. "But we have been struggling in the last few years to meet Moore's Law in terms of doubling the number of transistors on a chip and doubling the clock rate." Techniques that the industry has traditionally used to increase circuit density are beginning to bump up against silicon's fundamental physical limits. So many companies and research centers are trying to develop novel ways to store information and perform computations.

Skin-Tenna wrist watch
These antennas are small enough to be worn discreetly under clothes, and could be as thin as 1 cm.

A wireless antenna that channels signals along human skin could broadcast signals over your body to connect up medical implants or portable gadgets.


The new power-efficient approach could make more of established medical devices like pacemakers or help future implants distributed around the body work together.

Just one of the small hockey-puck-like antennas developed at Queen's University Belfast, Northern Ireland, would be able to connect to gadgets anywhere else on the body, says William Scanlon, who made the design with colleague Gareth Conway.

The new design's ability to produce signals that creep along the skin makes it more efficient than existing battery-hungry technologies such as Bluetooth, says Scanlon – an important factor for medical devices which need long life-spans.

Here's how it works:
wireless signals over human body

Compact "patch" antennas that lie flat on the skin have been made before. But they make poor connectors because most of their signals travel away from the body, not along it.

Mast-style 'monopole' antennas like those on cars are better at transmitting laterally, but they still transmit upwards too.

Now, Scanlon and Conway have designed a version that channels much more of its signal sideways by taking advantage of the "creeping wave" effect, which allows waves to travel along a surface. The same effect explains how both of a person's ears can hear a sound directed at only one side of the head.

NEC has a wearable wideband antenna that can be integrated into clothing, in case a skin network is too "creepy" for you.

The future may be filled with wearable devices powered by body heat, water or bacteria that communicate by sending signals along the surface of our skin. Now that's a future that sounds exciting, isn't it?

Energy from waste: gasification plant
Easy viewing: Gasification plants that convert municipal waste into energy and by-products can be built squat and stackless, according to Canadian developer PlascoEnergy. This artist's rendering shows the 400-metric-ton-per-day facility that PlascoEnergy plans to build in Ottawa, Canada's capital.
Credit: PlascoEnergy




This week, city councillors in Ottawa, Ontario, unanimously approved a new waste-to-energy facility that will turn 400 metric tons of garbage per day into 21 megawatts of net electricity--enough to power about 19,000 homes. Rather than burning trash to generate heat, as an incinerator does, the facility proposed by Ottawa-based PlascoEnergy Group employs electric plasma torches to gasify the municipal waste and then uses the gas to generate electricity.
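
Those headline figures are easy to sanity-check. The short sketch below derives the implied electricity yield per metric ton of waste and the average load per home from the article's own numbers.

```python
# Back-of-the-envelope check of the figures quoted above.
NET_POWER_MW = 21            # net electrical output
WASTE_TONNES_PER_DAY = 400   # municipal waste processed
HOMES_POWERED = 19_000       # homes served, per the article

energy_per_tonne_mwh = NET_POWER_MW * 24 / WASTE_TONNES_PER_DAY
avg_load_per_home_kw = NET_POWER_MW * 1000 / HOMES_POWERED

print(f"{energy_per_tonne_mwh:.2f} MWh of electricity per metric ton")  # ~1.26
print(f"{avg_load_per_home_kw:.2f} kW average load per home")           # ~1.11
```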


A few waste-to-energy gasification plants have been built in Europe and Asia, where landfilling is more difficult and energy has historically been more costly. But PlascoEnergy's plant would be the first large facility of its kind in North America. The company's profitability hinges on its ability to use a cooler gasification process to lower costs, as well as on rising energy and tipping fees to ensure strong revenues.

PlascoEnergy's approval marked the latest in a string of positive developments for waste gasification projects in recent weeks. Last month, Hawaii okayed $100 million in bonds to finance a waste-to-energy plant using plasma-torch technology from Westinghouse Plasma, based in Madison, PA, that is already employed in two large Japanese waste processing plants. Meanwhile, Boston-based competitor Ze-gen reported the successful ramp-up of a 10-metric-ton-per-day pilot plant in New Bedford, MA, that uses molten iron to break down waste.

Most gasification plants work by subjecting waste to extreme heat in the absence of oxygen. Under these conditions, the waste breaks down to yield a blend of hydrogen and carbon monoxide called syngas that can be burned in turbines and engines. What has held back the technology in North America is high operating costs. Plasma plants, using powerful electrical currents to produce a superhot plasma that catalyzes waste breakdown, tend to consume most of the energy they generate. As a result, the focus of plasma gasification plants has been to simply destroy hazardous wastes. "There was really no thought of being able to produce net power," says PlascoEnergy CEO Rod Bryden.

PlascoEnergy started looking at gasification for municipal solid waste five years ago, when it determined through simulation that cooler plasma torches could do the job. "The amount of heat required to separate gases from solids was much less than the amount being delivered when the purpose was simply to destroy the material," says Bryden. PlascoEnergy tested the models on its five-metric-ton-per-day pilot plant in Castellgali, Spain (jointly operated with Hera Holdings, Spain's second largest waste handler). In January, the company began large-scale trials in a 100-metric-ton-per-day demonstration plant built in partnership with the city of Ottawa.

Here's how it works. First, bulk metals are removed, and the rest of the shredded waste is conveyed to a 700 ºC gasification chamber. Most of it volatilizes to a complex blend of gases and rises toward a plasma torch operating at 1200 ºC--well below the 3000 to 5000 ºC used with hazardous wastes. The plasma reduces the complex blend to a few simple gases, such as steam, carbon monoxide, and hydrogen, plus assorted contaminants such as mercury and sulfur; subsequent cleanup systems remove the steam and mercury and scrub out the soot before the syngas is sent to an internal combustion engine generator.

The waste that doesn't volatilize forms a solid slag and drops to the bottom of the gasification chamber. The slag is then pushed to another plasma torch, which drives off the remaining carbon before the slag is cooled and vitrified. The resulting glass can be blended into asphalt road surfacing or cement.

Under its deal with Ottawa, PlascoEnergy will cover the estimated $125 million that it takes to build the plant, which could be operating within three years, while the city will pay only standard tipping fees--on the order of $60 per metric ton.

Ze-gen plans to avoid the challenge of handling complex municipal wastes by focusing first on an easier-to-handle feedstock: construction and demolition wood wastes. The company has filed seven patents on its molten metal gasification technology and waste-to-syngas process, but the equipment itself is standard for the steel industry, which uses molten iron to catalytically drive off impurities from ore. Ze-gen's pilot plant processes wood waste using a standard electrically heated steel-industry crucible full of molten iron.

Ze-gen CEO Bill Davis estimates that a full-size plant just slightly bigger than PlascoEnergy's commercial plant will produce enough syngas to create 30 megawatts of electricity, but he says that the syngas is also of sufficient quality to be used in other applications. As examples, he cites synthetic gasoline, diesel production, and refinery applications.