Saturday, November 23, 2013

Posted by lelyholida
No comments | 8:06 PM
Bluetooth does not describe a dental condition in which a patient has blue teeth. The term "Bluetooth" signifies a special new technology, a technology of the 21st century. Devices with Bluetooth technology allow their users to conduct two-way transmissions over short distances, usually no more than about 150 feet between the communicating devices. An individual who has access to two or more Bluetooth-enabled devices can carry out such short-range communications.

One big advantage to having access to some of the devices with the Bluetooth technology is the opportunity one gains to conduct a "conversation" between mobile and stationary technological items. The Bluetooth car kit underlines the plus side of having access to the Bluetooth technology. The Bluetooth car kit sets the stage for a "conversation" between a mobile and a stationary electrical gadget.

For example, the Bluetooth car kit permits a cell phone in the garage to communicate with a home computer. Thanks to Bluetooth, a car driver with a cell phone could sit inside a car and send a message to a home computer. By the same token, Bluetooth technology could allow a car to send a message to a personal computer. Such a message could inform a car owner that the motor vehicle sitting in the garage needed an oil change, rotation of the tires or some other routine procedure.

Not all modern automobiles come equipped with Bluetooth technology. So far only Acura, BMW, the Toyota Prius and Lexus offer the consumer this special feature. In order for the car owner to benefit from the potential of Bluetooth technology in a motor vehicle, all of the devices with that technology must use the same type of profile.

For example, if a car audio system contains devices with the Bluetooth technology, then any of the communications that take place between those devices require Bluetooth equipment that uses the same profile. Such restrictions typically specify that the Bluetooth car kit will work only if all of the inter-device communicating involves equipment that operates under the hands-free profile. In other words, a Bluetooth car kit would not be expected to allow a cell phone with a headset profile to communicate with a computer that had a dial-up networking profile.
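To make the profile-matching restriction concrete, here is a minimal Python sketch of the idea; the device names and profile labels are illustrative assumptions, not part of any real Bluetooth API.

# Minimal sketch of the profile-matching rule described above.
# Device names and profile labels are illustrative, not a real Bluetooth API.

HANDS_FREE = "HFP"   # hands-free profile
HEADSET = "HSP"      # headset profile
DIAL_UP = "DUN"      # dial-up networking profile


def can_communicate(profiles_a, profiles_b):
    """Two devices can talk only if they share at least one profile."""
    return bool(set(profiles_a) & set(profiles_b))


car_kit = {HANDS_FREE}
cell_phone = {HANDS_FREE, HEADSET}
home_computer = {DIAL_UP}

print(can_communicate(car_kit, cell_phone))        # True: both support hands-free
print(can_communicate(cell_phone, home_computer))  # False: no shared profile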

Of course Bluetooth technology is not confined to the automobile. It has also been responsible for allowing young teens to listen to music from an iPod, while at the same time being equipped and ready to handle any number of cell phone calls. On other occasions those same teens might choose to use the Bluetooth technology to send selected images from a digital camera to a home computer.

The Bluetooth technology has demonstrated the ability to lay the groundwork for creation of a mobile entertainment system. It could also facilitate the quick assembly of an operating and mobile office space. The father of the young teen who was listening to an iPod could very well be a traveling businessman at the airport, a man who must wait for a delayed flight. Access to Bluetooth technology would give such a man the ability to set up a temporary "office" in the airport terminal.

Once that same traveling businessman had reached his destination, and once he had settled into a motel room, he might use Bluetooth technology to send signals from a laptop computer to a print server. Both younger and older adults have demonstrated that Bluetooth is definitely a technology of the 21st century. Who would guess that the Bluetooth technology got its name from King Harald "Bluetooth" of Denmark, who lived back in the 10th century? King Harald sought to unite the countries of Scandinavia, much as Bluetooth technology helps different types of informational devices work in unison.

Have a Bluetooth [http://dottooth.net] enabled device and want to get the most out of it? Use our troubleshooting guide or frequently asked questions to make sure your device is working as it should. Also, learn how other companies are applying Bluetooth technology to their everyday working environment. Visit us for the latest Bluetooth headset [http://dottooth.net].

Article Source: http://EzineArticles.com/?expert=Nathan_T._Lynch

Article Source: http://EzineArticles.com/246166
Posted by lelyholida
No comments | 1:39 AM
Software rendering may be defined as the process of rendering that takes place in the absence of graphics-hardware ASICs such as a graphics card; the rendering happens entirely on the CPU. The main advantage of rendering everything with the CPU is that there is no restriction imposed by the capabilities of graphics hardware. The disadvantage is that more processing power is needed to achieve the same speed. Software rendering has two main branches: real-time rendering, which is also known as online rendering, and pre-rendering, which is also called offline rendering. Real-time rendering is used to interactively render a scene, as in 3D computer games. Pre-rendering, on the other hand, is used for creating realistic images and movies.

Rendering technology is now used in movie production, in science fiction and animation films that are called digital movies. This kind of digital film consists of 24 images per second, and each image consists of pixels. Each pixel has a color, so it is a big job for the film makers to write software that determines what color each pixel should be. For this they use a function called "render" to figure out the color of each pixel. This function itself invokes lots of other functions, which in turn invoke other functions. It is a great feeling to see what the artists do with this function when it works right. Rendering software is always creative and sometimes quite surprising. Among all the render farms, Fox Renderfarm always tries to give its best and be the most intuitive; sometimes they end up doing things quite differently from others.
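To make the per-pixel idea concrete, here is a toy Python sketch of a "render" function that assigns a color to every pixel of a single frame; the shading rule is invented purely for illustration and has nothing to do with any studio's actual software.

# Toy software renderer: compute a color for every pixel of one frame on the CPU.
# The shading rule is invented purely for illustration.

WIDTH, HEIGHT = 320, 180


def shade(x, y):
    """Return an (r, g, b) color for the pixel at (x, y) - a simple gradient."""
    r = int(255 * x / (WIDTH - 1))
    g = int(255 * y / (HEIGHT - 1))
    b = 128
    return (r, g, b)


def render():
    """Build one frame as a grid of pixel colors, row by row."""
    return [[shade(x, y) for x in range(WIDTH)] for y in range(HEIGHT)]


frame = render()
print(len(frame), "rows,", len(frame[0]), "pixels per row")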

Fox Renderfarm now also includes cloud render technology, which is based on cloud computing and is used to access data and computers from remote locations. In recent years the render industry and the cloud industry have started working together to meet people's growing rendering needs. As the technology gained popularity, many companies came forward with their own rendering software, and the industry has been prospering ever since. Providers of cloud rendering make it clear that their services should be accessible through the web, which makes it possible to reach rendering services from anywhere. Thanks to this advancement, companies working on movies and animations can now work when and where they want.

In this 21st century, rendering technology is used extensively in the rendering industry, bringing huge success in movie production as well as in other sectors.
Posted by lelyholida
1 comment | 12:15 AM
There are different kinds of problems that Apple users may encounter with their Mac computers and other Apple products. A Mac is one of the most expensive computers on the market, so you expect quality and lasting performance. It is true that an Apple computer can give long-lasting performance and quality for its high-end price, but accidents can always happen. Just like any other computer, it can experience issues and problems that affect its functionality. If you think that your computer is not running as smoothly as before, then you need to look for a Mac computer repair service in Miami. You can get high-quality repair for all types of Apple gadgets from Fix Apple Now.

The screen of a Mac desktop or laptop can develop problems; if you see strange colors on your screen, it is an indication that you need a repair. For this issue, it is better not to attempt a do-it-yourself fix, since that can only make the problem worse. You need professional, expert help from someone who knows everything about Apple devices. It is best to go to a Mac computer repair shop in Miami to find out the real cause of the problem. Their expert, certified Apple technicians will run a diagnostic test to determine whether the machine needs repair or parts replacement.

It is also not always helpful to visit the Apple store where you bought your Mac, since they might suggest buying a new one, especially if it is beyond the warranty period. If you go to a Mac repair shop in Miami, they will give you all possible options to repair and restore the functionality of your Mac computer or laptop. They may suggest some parts replacement. You do not have to think twice about visiting their nearest repair shop, as diagnostics are offered for free. You are free to decide whether you want a repair or not.

The Mac repair service in Miami from Fix Apple Now has a team of licensed Apple repair specialists, which is why they can deliver fast repairs, excellent service and the best prices compared to other Apple repair services in Miami. All kinds of repairs are done within 24 hours. They offer quick service for OS upgrades, a Mac stuck on the Apple logo, a frozen mouse, hard drive upgrades, a flashing folder, data recovery, Windows installation and much more.

They keep original Apple parts in stock, so they can finish the job as fast as possible. You should not hesitate to contact them once you experience a problem with your Mac computer or laptop. They will give you a complete quotation of the services and prices, so you know how much you need to pay at the end of the service. They use top-notch Apple computer parts and offer a three-month warranty on all their hardware repairs. Fix Apple Now is your best buddy when it comes to all kinds of repairs for Apple products on the market. There is no issue, problem or damage that their expert technicians cannot repair.

Tuesday, November 5, 2013

Posted by lelyholida
No comments | 9:29 PM
Computers as we know them are close to reaching an inflection point—the next generation is in sight but not quite within our grasp. The trusty programmable machines that have proliferated since the 1940s will someday give way to cognitive systems that draw inferences from data in a way that mimics the human brain.

IBM treated the world to an early look at cognitive computing in February 2011, when the company pitted its Watson computing system against two former champions on TV’s Jeopardy! quiz show. Watson’s ability to search a factual database in response to questions, to determine a confidence level and, based on that level, to buzz in ahead of competitors led it to a convincing victory. The accomplishment will soon seem quaint when compared with next-generation cognitive systems. IBM has already increased Watson’s speed, shrunk its previously room-size dimensions and hooked it to vastly more data.
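Conceptually, the buzz decision Watson made can be reduced to a confidence threshold. The Python sketch below is only a hedged illustration of that idea; the candidate answers, numbers and threshold are invented and are not IBM's.

# Hedged illustration of a confidence-gated buzz decision (values are invented).

def should_buzz(candidate_answers, threshold=0.75):
    """Buzz only if the top-ranked answer clears the confidence threshold."""
    best_answer, confidence = max(candidate_answers, key=lambda pair: pair[1])
    return confidence >= threshold, best_answer, confidence


candidates = [("Toronto", 0.31), ("Chicago", 0.84), ("Boston", 0.12)]
buzz, answer, confidence = should_buzz(candidates)
print(buzz, answer, confidence)  # True Chicago 0.84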

Last month two prominent medical facilities began testing the revamped Watson as a tool for data analysis and physician training. The University of Texas M. D. Anderson Cancer Center is using Watson to help doctors match patients with clinical trials, observe and fine-tune treatment plans, and assess risks as part of M. D. Anderson’s "Moon Shots" mission to eliminate cancer. Meanwhile, physicians at the Cleveland Clinic Lerner College of Medicine of Case Western Reserve University are hoping IBM’s technology can help them make more informed and accurate decisions faster, based on electronic medical records.

It’s a nice start, but still only the beginning, says John Kelly, an IBM senior vice president and director of IBM Research, one of the world’s largest commercial research organizations. In his recent book, Smart Machines: IBM’s Watson and the Era of Cognitive Computing, co-authored with IBM communications strategist Steve Hamm, Kelly notes that one important goal for cognitive computers, and for Watson in particular, is to help make sense of big data—the mountains of text, images, numbers and voice files that strain the limits of current computing technologies.

Scientific American spoke with Kelly about cognitive computing’s relative immaturity, its growing pains and how efforts to develop so-called neuromorphic computers, which mimic the human brain even more closely, could someday relegate Watson to little more than an early 21st-century artifact.

Thursday, October 31, 2013

Posted by lelyholida
No comments | 8:05 PM
Technology & HR - leverage one for the other: "Technology and HR are enablers of business. Integration of the two would mean not only harmonious co-existence but also leveraging one for the other. Leveraging technology for HR would mean digitizing the mundane HR activities and automating the back-office and transactional activities related to recruitment, performance management, career planning, succession planning, training and knowledge management. Leveraging HR for technology implies managing the change associated with technology by way of communication, training, hiring, retraining, stakeholder analysis and conscience keeping. Thus they can play complementary roles."

Technology and HR have one thing in common, i.e., both are enablers of business.

In recent times, technology has become synonymous with information technology, as hardly any other technological development of the past has impacted every spectrum of business the way information technology has. Irrespective of the kind of business you are in, i.e., services or goods, commodity or branded, trading or manufacturing, contemporary or traditional, deployment of information technology in one form or another is a foregone conclusion. To manage and deploy technology in an effective way, all business organizations need knowledge workers. Managing these knowledge workers is the responsibility of the HR function. Hence the integration of technology and HR is an absolute must.

Having understood technology and HR in the present context, we must understand integration in this context. Integration would not only mean harmonious co-existence but would also mean one enhancing and complementing the other, i.e., technology is used to enhance the effectiveness of HR, and the HR function helps in adopting and managing the change which technology deployment brings in.

Leveraging technology for HR

HR management as a function is responsible for deliverables like business strategy execution, administrative efficiency, employee contribution and capacity for change. All of these are accomplished through what HR people do, i.e., staffing, development, compensation, benefits, communication, organization design, high-performing teams and so on. In the majority of these areas technology is being deployed.

e-Recruitment

Recruitment is one area where all companies worth their name leverage IT. There are two different models of e-recruitment in vogue. One is recruitment through the company's own site, and the other is hosting your requirement on job sites, e.g., monster.com, jobsdb.com, jobsahead.com, naukri.com, jobstreet.com and so on. The first model is more popular with larger companies that have a brand pull for potential employees, e.g., GE, IBM, Oracle, Microsoft, HCL, ICICI, Reliance, MindTree Consulting, etc. Other companies prefer to go to the job sites. Some adopt both.

E-recruitment has come a long way since its start. These sites have now gone global. Sites like jobsahead.com and monster.com have established global networks encompassing separate sites for jobs in Australia, Denmark, Belgium, Canada, etc. Job seekers are able to search for jobs by region or country, and employers can target potential employees in specific countries. For example, 3Com recently posted a company profile on the Ireland site that highlights the contributions of 3Com's Irish design team to its global projects.

In the early days e-recruitment was plagued by a flood of low-quality bio-data reaching employers. Again technology has come as a savior. Pre-employment testing, like that introduced by Capital One, a US-based financial company, now helps in filtering applicants. These tools test applicants online, e.g., applicants for call centers. Profiles International, a Texas-based provider of employment assessments, has developed tools that allow instant translation of assessment tests between languages. Further developments like video-conference interviews, specialized sites, online executive recruitment and the combining of online and offline methods are leading more and more companies to adopt e-recruitment, at least as a secondary recruitment method. Arena Knights Bridge, a US-based IT company, conducts video-based interviews of its prospective employees, and only short-listed candidates are met in person. Even Cisco was set to launch the same.

Employee Self Service

Employee self-service is perhaps the one utility of IT which has relieved HR of most of its mundane tasks and helped it improve employee satisfaction. Employee self-service covers a plethora of small activities which were earlier carried out by employees through the administration wing of HR: travel bookings, travel rules information, travel bills, leave rules, leave administration, perks administration, etc. Earlier, all these rules and this information were in the custody of HR, and every user employee was expected to reach out to HR to get things done. Now, with the deployment of ESS in most companies, an employee can request travel-related bookings online, submit his or her expense bills, apply for leave, log time sheets and see the value of perks disbursed and due, and so on. For example, in Ballarpur Industries Ltd. (BILT) leave administration is completely digitized in the corporate office, and the company is working towards digitizing travel-related activities, perks and even compensation management and performance management administration. 'Digitize or outsource all the mundane and routine; focus only on core and value add' - Viet Chambray, V.P., BILT.

Communication

Communication, the most talked-about management tool, has always been a gray area in HR management. In large companies with a vast geographical spread, communicating with all employees has posed a formidable challenge to HR professionals. Technology has again come to the rescue. Starting with telephones and faxes, moving to e-mail and maturing into video conferencing, netcasts, webcasts, etc., communication is one area of HR which has greatly benefited from technology. Mouse-and-click companies like Oracle and IBM have intranets which cater to most of the information needs of their employees. Brick-and-mortar companies like BILT have also made a foray into deploying intranets for internal communication, with corporate notice boards, media coverage and knowledge corners.

Knowledge Management

Another area of HR which is leveraging technology is employee development. Programmed learning (PL), i.e., learning at one's own pace, is one of the most effective ways of adult learning, and the use of technology for this purpose can't be over-emphasized. Aptech Online University and 'The Manage Mentor' are some of the Indian sites in this business. Knowledge management, which is an integral part of any learning organization, cannot become a reality without technology. Companies can harness the knowledge of their employees by cataloging it and hosting it on the intranet. Talk to the 'Big Five' or the not-so-big consulting companies and you will find that the mainstay of their business is their knowledge repository. Technology has enabled them to retrieve it swiftly. In a competitive environment where speed is the name of the game, technology-driven knowledge management constantly provides a strategic advantage.

If you look at the HR modules of ERP solutions like PeopleSoft, SAP, Oracle and Ramco, they provide a comprehensive package which helps in manpower planning, recruitment, performance management, training and development, career planning, succession planning, separation and grievance handling. A transaction happening in any of these areas is digitized and forms a closed loop, ensuring the employee database is always up to date. For example, the joining letter of a new employee is system-generated, and it will be printed only when all mandatory fields of information have been entered. Similarly, a transfer order or a separation letter is issued from the system only if that transaction has been carried out in the system.
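As a hedged illustration of that closed-loop rule, the sketch below refuses to print a joining letter until every mandatory field has been entered; the field names are invented for the example and are not taken from any ERP product.

# Sketch of the closed-loop rule: a joining letter is generated only when all
# mandatory fields have been entered. Field names are invented for illustration.

MANDATORY_FIELDS = ["employee_id", "name", "designation", "department", "joining_date"]


def generate_joining_letter(record):
    """Raise an error if anything mandatory is missing; otherwise return the letter."""
    missing = [field for field in MANDATORY_FIELDS if not record.get(field)]
    if missing:
        raise ValueError("Cannot print letter; missing fields: " + ", ".join(missing))
    return ("Dear {name}, welcome aboard as {designation} in {department}, "
            "effective {joining_date}.".format(**record))


new_hire = {"employee_id": "E1042", "name": "A. Kumar", "designation": "Analyst",
            "department": "HR", "joining_date": "2013-11-01"}
print(generate_joining_letter(new_hire))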

For career planning and succession planning, skill and competency matrix methods are used by most of these systems. They first search for an employee with the required skills in the in-house database of employees. Once put into practice in letter and spirit, this system not only enhances business results by matching the right candidate to the right job but also improves retention of employees.

Processing payroll, churning out time office reports, providing HR-MIS are some other routine activities of HR which have been off-loaded to technology.

Leveraging HR for Technology

All HR professionals, preaching or practicing, learning or experimenting, teaching or studying, have experienced leveraging technology for HR. But many of us also come across situations where we need to leverage HR for technology. Let us understand what we mean by this.

Whenever technology is deployed afresh or upgraded, it involves a change. The change may be at the activity level, e.g., applying for leave through the intranet, or at the mental-model level, e.g., digitizing the succession-planning process, which has been HR professionals' forte. People have always resisted change. This is one area where HR professionals have to deliver, i.e., become change agents and lead the process of technology and change adoption. The resistance to change is directly proportional to the speed of change. Now the speed of change has increased, and hence so has the resistance.

Just to take an example, most ERP implementations in the world have not been able to deliver on all expectations, and some have failed to deliver at all. When analyzing the causes of failure, it has been observed that 96% of failures are due to people-related issues and only 4% are due to technology.

It is the people who make the difference; hence HR should exploit its expertise to facilitate the adoption of technology. I would like to put together some of the thoughts on what HR should do for this.

At the time of recruitment, stop hiring for skills; rather, hire for attitude and a learning mind. The skills of today are no longer valid tomorrow. Managing ever-changing change is the only criterion for success.

Functional or technical skills can be acquired on the job. Hence recruitment in the technology era needs to undergo a paradigm shift, i.e., from a skill/competency-based interview to one based on attitude and a learning mind or learning ability. That would translate into hiring skills for the future. In IBM, every employee has to fill in an individual development plan in which the employee commits to learning one or two new skills every year, thus remaining competitive all the time.

If we look at the chemistry of resistance to change, it is either a skill issue or a will issue. To address the will issue we need to work on a comprehensive solution starting from recruitment (as discussed earlier) through reward and compensation, leading to an organization culture which promotes change. A living example is 3M, a US-based company, where innovation is a way of life and 10% of revenue must come from new products every year. For them, change becomes a way of life.

To address the will issue further, the organization needs to prepare a communication strategy which creates a 'pull' for the technology. For example, when Granary went for SAP implementation, they anticipated resistance. To address this they started a house journal aimed at educating the employees on the benefits which would result from the adoption of ERP (SAP). This created a need; rather, a potential or latent need was brought out. Adoption of ERP did not become much of a problem.

At times the adoption of technology is perceived as a threat by employees, e.g., automation leading to a reduction in workers, or office automation leading to the retrenchment of clerks. HR needs to be associated with the technology adoption right from the beginning till the end. If HR is associated at the technology-selection stage, it can map the skills required and create a pull during implementation and adoption. Post adoption, it can release the excess employees who cannot be re-allocated.

To understand this process more clearly, we can take the example of ERP implementation. ERP is taken as an example because it is one technology adoption which affects employees across the organization, irrespective of function and position; any other automation may affect only a segment of the organization. ERP implementation in any organization goes through the following stages:

1. Selection of package

2. Business analysis

3. Solution design

4. Configuration and customization

5. Conference room piloting (CRP)

6. Go-live and production

At each stage HR has to play a role, which will help in mitigating resistance to change.

During the selection process, the change agent can understand the business benefits ERP would bring. This would help him draw up a comprehensive communication plan aimed at creating a 'pull' for the change. The communication plan may use various weapons from the armory; the obvious examples are newsletters, news flashes, the in-house journal, addresses by top management, webcasts, open-house sessions, and formal and informal meetings.

During the business analysis phase, the implementation team is supposed to analyze the existing business processes. At times this leads to the surfacing of data which is not very palatable to the process owners, leading to resistance. At this stage, HR has to be proactive again and carry out a detailed stakeholder analysis. Such an analysis should point to potential problem areas and potential champions of change.

Solution design involves defining the 'to-be' processes, i.e., the way business will be carried out in the future. At this stage HR has to play the role of a catalyst and turn up the heat. The idea is to make the most of the opportunity of package-enabled business transformation. HR can play a role by arranging to educate and train the right people on best business practices just before this phase.

During configuration and customization, HR has to keep beating the drum that customization of a standard package is a big no-no. Similarly, during conference room piloting (CRP) it should help in identifying the right persons to be involved in the CRP. Thorough testing at this stage will result in less pain at the time of going live. This is also the time to focus on training the end users, the employees who are going to use the system once it is implemented. Train, retrain and train again to ensure all prospective users are comfortable with the software before the system goes live.

During the go-live stage, HR has to work overtime to keep motivation levels high. This is the time when management starts losing patience, as one glitch after another keeps appearing and virtually brings the business to a halt. At this stage HR has to play 'conscience keeper' for the top management. Once into production, relocating the surplus staff is a challenge for which it has to be prepared well in advance.

This example makes it clear that the involvement of HR during the entire life cycle of a technology is valuable. ERP is not an isolated case; the same holds for any other technology adoption, only the finer details vary. Hence HR must play a proactive role, rather than being just a silent spectator or a mere executor of the wishes of the business or the chief technology officer, when it comes to technological change.

Having set out the case from different perspectives, it seems only logical to leverage technology for HR and vice versa.

Mr. Tamarind B. Hiram is a frequent speaker at internationally renowned global events, CEO/CTO/CIO Roundtables, Technology Conferences and Symposiums. He hosted and organized the Executive Technology Leadership Forum. He specializes in strategy, innovation, and leadership for change. His strategic and practical insights have guided leaders of large and small organizations worldwide.

Tamarind Bushman has been named to lists of the European Management Guru and is named as Europe's youngest management Guru and one of the Top most influential business thinkers in the world. http://www.theerce.com, http://www.indogreek.org

Article Source: http://EzineArticles.com/?expert=Tamarind_Bushman_Hiram

Article Source: http://EzineArticles.com/265210

Friday, October 25, 2013

Posted by lelyholida
No comments | 1:39 AM
Only a week ago I was watching the movie Iron Man on television. The billionaire entrepreneur Tony Stark was down in his lab building and modifying his new Iron Man suit. As his aides, Tony had a host of machines that he commanded and told what to do using nothing but his voice. As I watched this I thought to myself, this is just Hollywood fantasy; technology hasn't come this far. However, upon thinking about it, I realized we aren't that far away, and that voice technology is something we are embracing more and more. You don't have to look past an iPhone equipped with Siri or a Samsung using S Voice to see that speaking to machines is not confined to a Hollywood studio. Upon looking into this more, I discovered that voice technology isn't just used occasionally in phones to find the nearest restaurant; it is incorporated into the running of entire factories and vastly improves productivity through applications such as Speakeasy.

The application Speakeasy by Wavelink allows pickers in large warehouses to know where to go, how much of which item to pick and where to take it. Traditionally, pickers would carry around a piece of paper with the items required for picking written on it and would have to drive a forklift whilst trying not to lose their place on the paper. Once finished, a picker would have to individually mark off the items they had picked. From this it becomes evident that the picker does not have his hands free throughout the process, and it becomes very easy to lose their place or make a mistake, or even, on occasion, hit something whilst reading where to go. For this reason, Wavelink's Speakeasy voice technology has been implemented in some warehouses with great all-round success.

With this technology, all pickers wear a headset and have a small computer screen with a barcode scanner attached to their forearm, leaving both hands free for driving the forklift and picking up items. Through the headset, they can choose the language they want the voice to speak and interact with it, making it much easier to multitask and thus improving efficiency. The Speakeasy application informs the picker where they need to go to get the next item via a voice in the earpiece. The picker has many options, such as asking to hear the location again or asking the voice to speak slower or louder. Once at the location, the picker scans a barcode and Speakeasy confirms whether or not they are at the right location, eliminating the error of picking the wrong item. It then tells them how much of the item to pick. Once the picker has loaded these onto the forklift, a process made easier by the fact that they aren't carrying around a paper and pen, they simply ask for the next item. Upon reaching the final item, Speakeasy informs them that they have picked everything in this batch and tells them where to deliver it.
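The flow described above boils down to a simple loop: announce a location, confirm it with a barcode scan, announce a quantity, and move on. The Python sketch below mocks that loop only to illustrate the idea; it is not Wavelink's actual Speakeasy API.

# Mock of a voice-directed picking loop: announce location, confirm by barcode,
# announce quantity, move to the next item. Not Wavelink's actual API.

pick_list = [
    {"location": "A-12", "item": "widget", "qty": 4},
    {"location": "C-07", "item": "bracket", "qty": 2},
]


def speak(text):
    print("VOICE:", text)  # stand-in for text-to-speech in the headset


def run_picking(batch, scan_barcode):
    for line in batch:
        speak("Go to {}.".format(line["location"]))
        while scan_barcode() != line["location"]:
            speak("Wrong location, please try again.")
        speak("Pick {} of {}.".format(line["qty"], line["item"]))
    speak("Batch complete. Deliver to the dispatch bay.")


# Simulated scans: one wrong scan, then the correct locations.
scans = iter(["A-13", "A-12", "C-07"])
run_picking(pick_list, lambda: next(scans))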

The use of voice technology is something we are embracing more as improvements to it are made. With the progression of technology, machines are able to understand language with a greater degree of accuracy making voice technology a viable option for the future. As more people turn towards voice technology, we will see greater improvements and innovations made in this field. Who knows, one day in the future you may be having a deep philosophical discussion with your television - only time will tell.

For more information, please visit http://www.gammasolutions.com

Thursday, October 24, 2013

Posted by lelyholida
No comments | 12:14 AM
For home security camera systems, infrared night vision cameras are swiftly becoming the most popular type of camera. These cameras are available in indoor and outdoor models, and will give very clear color video throughout the day and black-and-white footage during the night, even in no-light conditions.

There are numerous things to think about when you are choosing infrared night vision cameras. First, figure out where the camera is going to be mounted. When placing an infrared camera, it is usually a good idea to make sure there are no obstructions between the camera and the main area to be viewed. Infrared cameras are not able to see through glass, because the infrared light will reflect back at the camera and wash out the picture. Also make sure that the camera is not directly facing a mirror, which can likewise reflect IR light back into the camera and wash the picture out. Next, you need to take into account the distance between where the camera will be mounted and the main viewing area.

For example, if a camera is being mounted on your front porch to watch people coming into and leaving your home, then measure the distance between where the camera will be situated and the entryway of the door. This distance will be used in deciding the correct IR range to pick. Once you know the distance between your camera and the area to be viewed, choose a camera which has about two times that distance in rated IR night vision range. The reason for the extra night vision distance is several-fold: the IR night vision distance of IR cameras is normally rated in indoor, climate-controlled conditions, and the actual distance will more than likely lessen due to elevated humidity, lower temperatures and so on, especially when positioning the cameras outdoors.
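The rule of thumb above, picking a camera rated for roughly twice the distance you actually need, is easy to capture in a short sketch; the example distance is made up.

# Rule of thumb from the text: choose a camera whose rated IR night-vision range
# is about twice the distance you actually need to cover.

def required_ir_range(distance_to_subject_ft, safety_factor=2.0):
    """Return the minimum rated IR range to shop for, in feet."""
    return distance_to_subject_ft * safety_factor


# Example: camera mounted 25 feet from the front door.
print(required_ir_range(25))  # 50.0 -> look for a camera rated for about 50 ft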

While setting up your camera, make sure that the camera and any exposed wires are out of reach; this will help prevent vandals from disabling or moving the camera. If you follow these guidelines for the selection and installation of an infrared night vision camera, then you should be able to get excellent video quality for daytime and nighttime use and supply video to a VCR, stand-alone DVR or PC-based DVR for excellent home security viewing in no-light conditions.
Posted by lelyholida
No comments | 12:07 AM
Samsung Electronics has launched two new Android-based smartphones, the GALAXY Trend and the GALAXY Star Pro.

These devices offer a smartphone experience in nine Indian regional languages, including Hindi, Bengali, Punjabi, Marathi, Gujarati, Kannada, Malayalam, Telugu and Tamil. Both the Samsung GALAXY Trend and GALAXY Star Pro are equipped with the Android 4.1 Jelly Bean operating system and a 1GHz single-core processor.

Vineet Taneja, Country Head, Samsung Mobile & IT, said, "At Samsung, we are committed to listening to our consumers and providing innovative, yet affordable technology to them. GALAXY Trend and GALAXY Star Pro will provide our consumers an exceptionally rich mobile experience in the regional languages with the ability to access an array of apps in their preferred language. We believe these new devices coupled with the innovative language capability will help us address a larger set of consumers."

The Samsung GALAXY Trend comes with a 4.0-inch WVGA display with 480x800-pixel resolution. The smartphone is coupled with 512MB of RAM. It also offers 4GB of internal memory that is expandable via microSD card up to 32GB, and is equipped with a 3-megapixel rear camera.

The GALAXY Trend will be available in black, red and white colors at an attractive price of Rs.8,290/-.

The GALAXY Star Pro is a dual-SIM handset. It has 512MB of RAM along with 4GB of internal storage, which can be supplemented with a microSD card of up to 32GB. It also features a 4-inch WVGA touch screen and a rear camera.

GALAXY Star Pro would be available for Rs.6,750/-.

Two of Samsung's budget Android smartphones, the Galaxy Trend and Galaxy Star Pro, have finally been officially launched in the Indian market. The Samsung Galaxy Trend comes at a price of Rs. 8,290, while the Samsung Galaxy Star Pro has been priced at Rs. 6,750. Both the Galaxy Trend and Galaxy Star Pro smartphones were listed recently on the company's India online store.

The Samsung Galaxy Trend features a 4-inch WVGA (480x800 pixels) TFT display, and is powered by a 1GHz processor (unspecified chipset). It is a dual-SIM (GSM+GSM) smartphone, and runs Android 4.0 Ice Cream Sandwich (according to the official Samsung India eStore) with TouchWiz UI on top. The device has dimensions of 121.5x63.1x10.85mm and weighs 126 grams. The Galaxy Trend sports a 3-megapixel rear camera with no flash. Some camera features included in the device are single shot, panorama, smile shot, share shot, photo effects, white balance and timer.

The Galaxy Trend comes with 512MB of RAM and 4GB of inbuilt storage, out of which only 2.05GB is user-accessible. It also supports expandable storage of up to 32GB via microSD card. Connectivity features on the device include Wi-Fi, Bluetooth, EDGE, GPS and 3G. There is a battery backing the device which the company claims can deliver up to 8 hours of talk time and up to 350 hours of standby time.

Tuesday, October 22, 2013

Posted by lelyholida
No comments | 10:38 PM

IVR (Interactive Voice Response) systems serve as a way for clients to connect with a business without the need for live human interaction. This is achieved by connecting a company's computer database to their telephone system in order to provide clients with the info they seek.

These systems have been around for years and big as well as small companies use them to: better their client service, display a professional brand identity, and even sell their products.

Traditionally if a client phones a business with an IVR system, they are greeted by a pre-recorded voice instructing them to push a certain number on their phone in order to access a specific menu option. It is simple and easy to use. For example: the system asks the caller to press one for account enquiries, and the caller responds by pressing the number one on the keypad of their phone.

Developers looking to take Interactive Voice Response systems to the next level came up with the idea of using voice prompts instead of push selections. Naturally, this presented an array of challenges, but it was not impossible.

But why would a business want a speech-enabled system instead of one with touch prompts?

Voice-enabled IVR gives way to more flexibility. Companies are able to get more creative with their presentation and offer their clients options that an ordinary touch-prompt system doesn't lend itself to. It gives clients the feeling of a more natural type of interaction, even if it is limited. It might even help a brand appear a little more "human."

Voice-enabled systems that are currently available are pretty basic. This is due to problems with speech recognition. Because people have different accents and don't always pronounce words the same way, systems are limited to very basic voice commands that closely resemble those of a touch-tone menu. Usually input is limited to a single word or a very short phrase.

For example:
System: "If you wish to access your account, say One."
User: "One."
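A touch-tone menu and a basic single-word voice menu can share the same routing logic. The sketch below illustrates that idea only; the menu options and keywords are invented, not taken from any IVR product.

# Shared routing logic for a touch-tone (DTMF) menu and a one-word voice menu.
# Menu options and keywords are invented for illustration.

MENU = {
    "1": "account enquiries",
    "2": "technical support",
    "3": "speak to an operator",
}

VOICE_ALIASES = {"one": "1", "two": "2", "three": "3"}


def route(user_input):
    """Accept a keypad digit or a spoken keyword and return the selected option."""
    key = user_input.strip().lower()
    key = VOICE_ALIASES.get(key, key)  # map "one" -> "1", and so on
    return MENU.get(key, "sorry, I did not understand that")


print(route("1"))      # account enquiries (touch-tone)
print(route("One"))    # account enquiries (voice)
print(route("pizza"))  # sorry, I did not understand that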

However, voice-enabled systems are rapidly evolving as the software developers are overcoming the various challenges associated with speech recognition. Very soon clients will be able to use phrases or even whole sentences to voice their instructions. This will open up an amazing array of possibilities for a business and will give birth to services that haven't existed before.

The future of IVR systems is very bright - all thanks to the constant evolution of technology!

Tuesday, October 8, 2013

Posted by lelyholida
No comments | 12:35 PM
This blog is a personal blog written and edited by us. This blog accepts forms of cash advertising, sponsorship, paid insertions or other forms of compensation. The compensation received may influence the advertising content, topics or posts made in this blog. That content, advertising space or post may not always be identified as paid or sponsored content. The owner(s) of this blog is not compensated to provide opinion on products, services, websites and various other topics. The views and opinions expressed on this blog are purely the blog owners. If we claim or appear to be experts on a certain topic or product or service area, we will only endorse products or services that we believe, based on our expertise, are worthy of such endorsement. Any product claim, statistic, quote or other representation about a product or service should be verified with the manufacturer or provider. This blog does contain content which might present a conflict of interest. This content may not always be identified.

Tuesday, September 10, 2013

Posted by lelyholida
No comments | 9:45 PM
Well, this is becoming an unfortunate trend. Another train carrying oil has derailed and exploded, this time in Alabama. From the Reuters news report:

A 90-car train carrying crude oil derailed and exploded in western Alabama in the early hours of Friday morning, spilling oil and leaving eleven cars burning in the rural area.

No injuries have been reported, but 20 of the train’s cars derailed and 11 were still on fire, the train owner, Genesee & Wyoming, said in a statement on Friday. Those cars, which threw flames 300 feet into the night sky, are being left to burn down, which could take up to 24 hours, the company said.

A local official said the crude oil had originated in North Dakota, home of the booming Bakken shale patch. If so, it may have been carrying the same type of light crude oil that was on a Canadian train that derailed in the Quebec town of Lac-Megantic this summer, killing 47 people.

I can’t think of a clearer manifestation of how stressed and ill-equipped the U.S. energy infrastructure is to handle the domestic oil boom than exploding trains. Pipeline capacity simply doesn’t exist to move oil from the central parts of the country to refineries and markets (especially on the East Coast). Trains are the quick and easy solution. As I mentioned the last time an oil train derailed (“Another oil train derails in Canada”), putting oil on train cars is cheaper than building new pipelines and doesn’t require environmental review.

Clearly, oil-by-train standards will be tightened up. While that is long overdue, that doesn’t address the core issue, which is with having adequate petroleum infrastructure. To remedy this, the U.S. government could streamline permitting rules. The Federal Energy Regulatory Commission (FERC) approves interstate natural gas pipelines in a relatively speedy 18 months. Petroleum pipelines, on the other hand, are permitted at the state level, rather than the federal level, drawing out the permitting process and driving up costs. So instead of wading through state regulatory commissions and agencies, someone looking to move oil will simply hire a train, where there is little safety oversight.

Streamlined permitting would benefit petroleum producers, who would gain an easier way to move their product (as the National Petroleum Council recommended – PDF), while environmentalists could chalk it up as a win for delivering a safer transportation network and avoiding the pollution and loss of life that come with each derailment and explosion.

Monday, September 9, 2013

Posted by lelyholida
No comments | 9:40 PM
Over the last few years, we’ve heard a lot about how “Big Data”—which as far as I can tell is just data mining in a glossy new wrapper–are going to revolutionize science and help us create a better world.* These claims strike me as all too familiar. They remind me of the hype generated in the 1980s by chaos and in the 1990s by complexity (which was just chaos in a glossy new wrapper). Chaos and complexity enthusiasts promised (and are still promising) that ever-more-powerful computers plus jazzy new software and math were going to crack riddles that resisted more traditional scientific methods.

Advances in data-collection, computation and search programs have led to impressive gains in certain realms, notably speech recognition, language-translation and other traditional problems of artificial intelligence. So some of the enthusiasm for Big Data may turn out to be warranted. But in keeping with my crabby, glass-half-empty persona, in this post I’ll suggest that Big Data might be harming science, by luring smart young people away from the pursuit of scientific truth and toward the pursuit of profits.

My attention was drawn to this issue by a postdoc in neuroscience, whose research involves lots of data crunching. He prefers to remain anonymous, so I’ll call him Fred. After reading my recent remarks on the shakiness of the scientific literature, he wrote me to suggest that I look into a trend that could be exacerbating science’s woes.

“I think the big science journalism story of 2014 will be the brain drain from science to industry ‘data science,’” Fred writes. “Up until a few years ago, at least in my field, the best grad students got jobs as professors, and the less successful grad students took jobs in industry. It is now the reverse. It’s a real trend, and it’s a big deal. One reason is that science tends not to reward the graduate students who are best at developing good software, which is exactly what science needs right now…

“Another reason, especially important for me, is the quality of research in academia and in industry. In academia, the journals tend to want the most interesting results and are not so concerned about whether the results are true. In industry data science, [your] boss just wants the truth. That’s a much more inspiring environment to work in. I like writing code and analyzing data. In industry, I can do that for most of the day. In academia, it seems like faculty have to spend most of their time writing grants and responding to emails.”

Fred sent me a link to a blog post, “The Big Data Brain Drain: Why Science is in Trouble,” that expands on his concerns. The blogger, Jake VanderPlas, a postdoc in astrophysics at the University of Washington, claims that Big Data is, or should be, the future of science. He writes that “in a wide array of academic fields, the ability to effectively process data is superseding other more classical modes of research… From particle physics to genomics to biochemistry to neuroscience to oceanography to atmospheric physics and everywhere in-between, research is increasingly data-driven, and the pace of data collection shows no sign of abating.”

VanderPlas suggests that the growing unreliability of peer-reviewed scientific results, to which I alluded in my last post, may stem in part from the dependence of many research results on poorly written and documented software. The “crisis of irreproducibility” could be ameliorated, VanderPlas contends, by researchers who are adept at data-analysis and can share their methods with others.

The problem, VanderPlas says, is that academia is way behind Big Business in recognizing the value of data-analysis talent. “The skills required to be a successful scientific researcher are increasingly indistinguishable from the skills required to be successful in industry. While academia, with typical inertia, gradually shifts to accommodate this, the rest of the world has already begun to embrace and reward these skills to a much greater degree. The unfortunate result is that some of the most promising upcoming researchers are finding no place for themselves in the academic community, while the for-profit world of industry stands by with deep pockets and open arms.”

VanderPlas and Fred, who are apparently software whizzes themselves, perhaps overstate the scientific potential of data crunching just a tad. And Fred’s aforementioned claim that industry “just wants the truth” strikes me as almost comically naïve. [**See Fred's clarification below.] For businesses, peddling products trumps truth–which makes the brain drain described by Fred and VanderPlas even more disturbing.

Fred is a case in point. Increasingly despondent about his prospects in brain research, he signed up for training from Insight Data Science, which trains science Ph.D.s in data-manipulation skills that are desirable to industry (and claims to have a 100 percent job placement record). The investment paid off for Fred, who just got a job at Facebook.

*Should “Big Data” be treated as plural or singular? I polled my students, and they said plural, so I went with plural.

**Re his comment about industry bosses wanting “truth,” “Fred” just emailed me this clarification: “I think there is a distinction, which I perhaps should have made clearer, between ‘marketing’ and ‘analytics.’ When it comes to marketing a product to consumers, I agree it’s pretty obvious that business incentives are not aligned with truth telling. No one disputes that. But when it comes to the business’s internal ‘analytics’ team, the incentives are very aligned with truth telling. Analytics teams do stuff like: determining how users are interacting with the product, measuring trends in user engagement or sales, analyzing failure points in the product. This is the type of work that most data scientists do.”

***A couple of afterthoughts on this topic: First, Lee Vinsel, my Stevens colleague and former friend, points out in a comment below that industry has long lured scientists away from academia with promises of filthy lucre and freedom from the grind of tenure-and-grant-chasing. Yup. Wall Street “quants” are just one manifestation of this age-old phenomenon. So what’s new about the Big Data Brain Drain? Does it differ in degree or kind from previous academia-to-business brain drains? Good questions, Lee. I have no idea, but I bet Big Data can provide the answer! (Unless of course it’s subject to some sort of Godelian limit on self-analysis.)

Second, a fascinating implication of the rise of Big Data is that science may increasingly deliver power—that is, solutions to problems—without understanding. Big Data can, for example, help artificial intelligence researchers build programs that play chess, recognize faces and converse without knowing how human brains accomplish these tasks. The same could be true of problems in biology, physics and other fields. If science doesn’t yield insight, is it really science? (For a smart rebuttal of the notion that Big Data could bring about “the end of theory,” see the smart blog post mentioned below by Sabine Hossenfelder.)

Sunday, September 8, 2013

Posted by lelyholida
No comments | 9:38 PM
A mathematician and a chef have produced objects that mimic the function and beauty of biological organisms

Finding a bug in your drink is an unpleasant surprise, but researchers at Massachusetts Institute of Technology have created a fanciful cocktail accessory based on the mechanics of water bugs—and another less ironically modeled after the workings of a delicate water lily.

In partnership with José Andrés, a renowned chef who lectures on the science of cooking at Harvard University, applied mathematics professor John Bush designed two cocktail accessories—a pipette modeled after a water lily, which serves to pick up drops of cocktails meant to cleanse the palate and deposit them on the diner’s tongue, and an edible “boat” that circles around the surface of alcoholic drinks. Both were produced on 3D printers, which allowed the researchers to modify their prototypes rapidly, and the objects were inspired by Bush’s desire to combine mathematics and culinary art.

After attending one of Andrés’s Harvard lectures, Bush approached him, suggesting they collaborate on edible designs that relied on mathematical properties. “Much of my research concerns surface tension,” Bush says, “which is responsible for a number of interesting effects that arise in the kitchen—or the bar.”

The cocktail boat is filled with an alcohol of a higher proof than the drink it floats in, which it then releases steadily through a notch at one end. This creates a difference in surface tension, propelling the boat forward in a phenomenon called the Marangoni effect. The design, described in a paper published in the October issue of Bioinspiration & Biomimetics, is inspired by a mechanism found in nature: Many aquatic insects rely on Marangoni propulsion, which they create by releasing chemicals that produce a gradient in surface tension. When dropped onto a watery surface, the bugs use this mechanism to skitter safely back to shore. Bush’s team optimized the design of the boats for speed and fuel efficiency—in other words, the amount of time they could move before running out of alcohol, or about two minutes. You can see them in action here.
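In symbols, the driving force of Marangoni propulsion is simply the surface-tension difference acting along the wetted edge where the alcohol leaks out. As a rough back-of-the-envelope estimate (the symbols here are assumptions for illustration, not taken from the paper),

F \approx (\gamma_{\text{drink}} - \gamma_{\text{fuel}})\, w ,

where \gamma_{\text{drink}} and \gamma_{\text{fuel}} are the surface tensions of the cocktail and of the higher-proof alcohol released through the notch, and w is the width of the notch. Because added ethanol lowers surface tension, the difference is positive and the boat is pushed away from the notch.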

The floral pipettes, which Bush says are intended to deliver dainty drops of liquid to the tongue—concoctions, according to the paper, that Andrés will develop specially to refresh diners’ palates during multi-course meals—look like upside-down flowers. When the flowers are dipped into liquid and pulled back out, their petals fold shut and hold some of the liquid inside. The design, Bush says, is “an inversion of the design of floating flowers that, when exposed to floods, wrap up in order to protect genetic material.” Biomimicry is nothing new—Velcro, Scotch tape and the airplane are all examples of designs borrowed from nature—but creating the cocktail accessories was a unique challenge. “Typically in the lab,” Bush says, “function is everything, but given our ultimate goal, aesthetic appeal was also a consideration.” As the researchers stated in their paper, they strove for “the mimicry not only of nature’s function, but of her elegance.”

Now the designs are in the hands of Andrés’ management company, ThinkFoodGroup. “The chefs are taking it a step further,” Bush says, “The designs are not to be only functional and aesthetically pleasing, but edible.” The hope is that they’ll soon make a debut at Minibar, Andrés’ restaurant in Washington, DC. And in the meantime, Bush says he’s always on the lookout for more natural mechanisms he can replicate. 


Saturday, September 7, 2013

Posted by lelyholida
No comments | 9:37 PM
A couple of weeks ago, I was writing up a description of Einstein’s general theory of relativity, and I thought I’d compare the warping of spacetime to the motion of Earth’s tectonic plates. Nothing on Earth’s surface has fixed coordinates, because the surface is ever-shifting. Same goes for spacetime. But then it struck me: if nothing has fixed coordinates, then how do Google Maps, car nav systems, and all the other mapping services get you where you’re going? Presumably they must keep updating the coordinates of places, but how?

I figured I’d Google the answer quickly and get back to Einstein, yet a search turned up remarkably little on the subject. So, as happens distressingly often in my life, what I thought would take 30 seconds ended up consuming a couple of days. I discovered a sizable infrastructure of geographers, geologists, and geodesists dedicated to ensuring that maps are accurate. But they are always a step behind the restless landscape. Geologic activity can create significant errors in the maps on your screens.

One of the people I talked to is Ken Hudnut of the U.S. Geological Survey, an earthquake researcher (and blogger) who set up one of the first GPS networks to track plate motions. “Say that you’re standing right in the middle of a road intersection with your GPS receiver and you get the coordinates for your position,” he says. “You look at Google Earth, and instead of being located right at the middle of the road intersection, you’re off by some amount.” Several factors produce these errors. Consumer GPS units have a position uncertainty of several meters or more (represented by a circle in Google Maps). Less well known is that maps and satellite images are typically misaligned by a comparable amount. “It’s partly the GPS hardware that limits the accuracy, and part of it may also be the quality of the georeferencing,” Hudnut says.

An interesting, if dated, study from 2008 looked at Google Earth images in 31 cities in the developed world and found position errors ranging from 1 to 50 meters. It’s not hard to do your own experiments. The image at left shows my position in Google Maps while I was standing on my back deck—a discrepancy of about 10 meters, much larger than the stated error circle. When I go to Google Earth and compare images taken on different dates, I find that my house jumps around by as much as 20 meters.
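
If you want to put a number on a discrepancy like that yourself, the arithmetic is just the great-circle distance between two latitude/longitude pairs. Here is a minimal Python sketch using the haversine formula; the two coordinate pairs are made-up placeholders standing in for a GPS fix and the corresponding spot on a satellite image.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points given in degrees."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Placeholder coordinates: a hypothetical GPS fix versus the spot on the
# satellite image where you are actually standing.
gps_fix = (40.712800, -74.006000)
map_spot = (40.712750, -74.006100)
print(f"Discrepancy: {haversine_m(*gps_fix, *map_spot):.1f} m")
```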

In the grand scheme of things, this isn’t much, but it does make you wary of high zoom levels. Hudnut says he sees map bloopers in his field work all the time. As technology progresses, so will our expectations. “We’re fast approaching the day when people will expect accuracies of centimeters in real time out of their handheld devices and then we’ll see a lot of head scratching as things no longer line up,” says Dru Smith of the National Geodetic Survey in Silver Spring, Md., the nation’s civilian chief geodesist—the go-to guy for the precise shape and size of our planet.

For the most part, misalignments don’t represent real geologic changes, but occur because it’s tricky to plop an aerial or orbital image onto the latitude and longitude grid. The image has to be aligned with reference points established on the ground. For this purpose, NGS maintains a network of fixed GPS stations and, over the past two centuries, has sprinkled the land with survey marks—typically, metallic disks mounted on exposed bedrock, concrete piers, and other fixed structures. The photo at left shows one near my house. But the process of ground-truthing a map is never perfect. Moreover, the survey-mark coordinates can be imprecise or downright wrong.

NGS and other agencies recheck survey marks only very infrequently, so what a stroke of luck that a whole new community of hobbyists—geocachers—does so for fun. “One of the many things we no longer have money to do is send out people to make sure those marks are still there,” Smith says. “Geocachers, through this creation of a new recreation of going out and finding these marks, are sending in tons of reports.… It’s been helpful to us to keep the mark recoveries up to date.”

Errors also sneak in because the latitude and longitude grid (or “datum”) is not god-given, but has to be pegged to a model of the planet’s shape. This is where plate tectonics can make itself felt. Confusingly, the U.S. uses two separate datums. Most maps are based on NAD 83, developed by NGS. Google Maps and GPS rely instead on WGS 84, maintained by a parallel military agency, which, thanks to Edward Snowden, we now know has a considerably larger budget. The civilian one is optimized for surveying within North America; the military one sacrifices domestic precision for global coverage.

When NGS introduced NAD 83, replacing an older datum that dated to 1927, it was the geographic version of the shift from the Julian to the Gregorian calendar. If you’d been paying attention, you would have woken up on December 6, 1988, to find that your house wasn’t at the same latitude and longitude anymore. The shift, as large as 100 meters, reflected a more accurate model of Earth’s shape. Vestiges of the old datum linger. You still see maps based on NAD 27. Also, when the U.S. Navy developed the first satellite navigation system in the 1960s, engineers set the location of 0 degrees longitude by extrapolating the old North American datum. Only later did they discover they had drawn the meridian about 100 meters east of the historic Prime Meridian marker at the Royal Observatory in Greenwich. (Graham Dolan tells the whole, convoluted story on his website, the definitive reference on the meridian.)

NGS and its military opposite number worked together to align their respective datums, but the two systems have drifted apart since then, creating a mismatch between maps and GPS coordinates. Plate tectonics is one reason. WGS 84 is a global standard tied to no one plate. In essence, it is fixed to Earth’s deep interior. Geodesists seeking to disentangle latitude and longitude from the movements of any one particular plate assume that tectonic plates are like interlocking gears—when one moves, all do—and that, if you add up all their rotational rates, they should sum to zero. The effect of not tying coordinates to one plate is that surveyed positions, and the maps built upon them, change over time.
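
To see why an Earth-fixed frame makes surveyed positions creep, it helps to multiply a plate velocity by time. The sketch below does that arithmetic; the 2-centimeter-per-year rate is an illustrative assumption, not a measured velocity for any particular station.

```python
# If your reference frame is fixed to the Earth as a whole rather than to the
# plate under your feet, coordinates accumulate the plate's motion year after
# year, while a plate-fixed datum such as NAD 83 simply rides along. The
# 2 cm-per-year rate below is an illustrative assumption, not a measured
# velocity for any particular station.

M_PER_DEG_LAT = 111_320.0            # rough meters per degree of latitude
plate_velocity_m_per_yr = 0.02       # assumed horizontal plate motion

for years in (1, 10, 25, 50):
    drift_m = plate_velocity_m_per_yr * years
    drift_deg = drift_m / M_PER_DEG_LAT
    print(f"After {years:>2} years: ~{drift_m:.2f} m of drift "
          f"(about {drift_deg:.7f} degrees of latitude)")
```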

In contrast, NAD 83 sits atop the North American plate like a fishnet laid out on the deck of a boat. As the plate moves, so does the datum. Other regions of the world likewise have their own local datums. That way, drivers can find their way and surveyors can draw their property lines in blissful ignorance of large-scale tectonic and polar motion. “Most surveyors and mapmakers would be happy to live in a world where the plates don’t move,” Smith explains. “We can’t fix that, but we can fix the datum so that the effect is not felt by the predominant number of users.… Generally speaking, a point in Kansas with a certain latitude and longitude this year had that exact same latitude and longitude 10 years ago or 10 years from now.… We try to make the planet non-dynamic.”

To deepen the datum discrepancy, NAD 83 has not been revamped to account for improved knowledge of Earth’s shape and size. “We are currently working with a system that is very self-consistent and very internally precise, but we know, for example, that the (0,0,0) coordinate of NAD 83, which should be the center of the Earth, is off by about two meters,” Smith says. NGS plans an update in 2022, which will shift points on the continent by a meter or more (as shown in the figure at top of this post).

The tradeoff for keeping surveyors happy is that the North American latitude and longitude grid is increasingly out of sync with the rest of the world (as shown in the diagram at left, in which you can see how the North American plate is rotating about a point in the Yucatán). The “rest of the world” includes Southern California, which straddles the North American and Pacific plates. The Pacific plate creeps a couple of inches toward the northwest every year relative to the rest of North America. The plate boundary is not sharp, so the actual amount of movement varies in a complicated way. The California Spatial Reference Center in La Jolla has a network of tracking stations and periodically updates the coordinates of reference points in the state. “That’s what the surveyors then use to tie themselves into NAD 83,” says the center’s director, Yehuda Bock. The last update was in 2011 and another is planned for next year.

Like Smith, Bock says that more frequent updating would actually complicate matters: “Surveyors do not like it if coordinates change, so this is kind of a compromise.” For localized line-drawing, it doesn’t much matter, but large-scale projects such as the California high-speed rail system have to keep up with tectonic motion.

Things obviously get more interesting during earthquakes. “What the earthquake would do is the equivalent of what you do with a pair of scissors, if you cut diagonally across a map along a fault line and then slid one side of the map with respect to the other,” Hudnut says. For instance, in Google Earth, go to the following coordinates north of Palm Springs, near the epicenter of the 1992 Landers quake: 34.189838 degrees, –116.433842 degrees. Bring up the historical imagery, compare the July 1989 and May 1994 images, and you’ll see a lateral shift along the fault that runs from the top left to the bottom right of the frame. The alignment of Aberdeen Road, which crosses the fault, shifts noticeably. The quake displaced the land near the fault by several meters.

GPS networks can even see earthquakes in real time. Here’s a dramatic video of the 2011 Tohoku quake, made by Ronni Grapenthin at U.C. Berkeley based on data from the Japanese Geospatial Information Authority. The coastline near the quake site moved horizontally by as much as 4 meters. The video also shows the waves that rippled outward over Japan (and indeed the world).

Adjustments for tectonic activity take time to filter down to maps. I spoke with Kari Craun, who, as director of the USGS National Geospatial Technical Operations Center near St. Louis, is in charge of producing the USGS topographic maps beloved of outdoors enthusiasts. She says the maps are updated every three years (and even that pace has been hard to maintain with budget cuts). In between, mapmakers figure, the error is swamped by the imprecision of mapping and GPS equipment. Future maps may be updated at a rate closer to real time. “We have the technology now with GPS to be able to make those slight adjustments on a more frequent basis,” Craun says. As someone who relies on Google Maps to get around, I look forward to that. But the romantic in me prefers seeing out-of-date maps. They never let us forget the dynamism of our planet.

Friday, September 6, 2013

Posted by lelyholida
No comments | 9:32 PM
Rob Carlson drove his Tesla Model S down Route 167 outside Seattle. He accidentally ran over a piece of metal, likely a fender or other curved piece from a truck. That debris somehow punctured the quarter-inch-thick armored undercarriage of the vehicle and penetrated its battery pack. Within 30 minutes, the car was in flames—the first fully electric vehicle fire on a U.S. road and a viral video sensation.

Fortunately, the Model S comes equipped with a warning system. "The car warned the driver to get off the highway as soon as the incident happened. That's awesome," notes chemist Jeff Chamberlain, deputy director of the Joint Center for Energy Storage Research at Argonne National Laboratory, otherwise known as the U.S. government's battery research hub. "Of course, any $80,000 car should be able to do that for you."

A few weeks later, a driver in Merida, Mexico, lost control of his Model S at high speed, crashing through a concrete wall and into a tree. The driver and his passengers were able to walk away, apparently uninjured, before the Tesla burst into flames. And now a third Model S has caught fire after an accident initiated by yet more road debris—this time a renegade tow hitch near Smyrna, Tenn. (coincidentally where Nissan builds its all-electric LEAF, which uses similar lithium-ion batteries)—that again appears to have pierced the car’s battery pack and set it ablaze on November 6.

Battery fires are not an issue confined to the Tesla Model S, which is by some measures the world's safest car. A fleet of 16 Fisker Karmas burned last October after being doused with seawater during Superstorm Sandy. And a Chevy Volt sitting in a garage weeks after safety testing by the National Highway Traffic Safety Administration (NHTSA) suddenly began to smolder and burn in 2011, prompting a full safety investigation that later cleared the model for sale. That’s 20 or so incidents in the past few years. For comparison, note that there is a fire in the predominant type of vehicle on the road—a car powered by an internal combustion engine—every four minutes or so.

Nonetheless, battery cars can burst into flames. This is not a problem confined to cars—think Boeing's Dreamliner or any number of Sony products. Rather, it is a problem confined to batteries. Pack a lot of chemical energy into a small space and if something goes wrong, fire or explosions are the inevitable result. So what should be done in the event of such novel fires?

Novel fires
Lithium-ion is the world's most popular battery technology, employed by the hundreds of millions in everything from cell phones to electric cars. Yet such mishaps have proved extremely rare.

Here's how a lithium ion battery works. A plastic film separates a positive and negative electrode, all of which is bathed in electrolyte, in this case a clear chemical solvent. The electrolyte is a carbonate liquid that ionizes the lithium, causing it to pick up an extra electron. Each of these lithium ions then acts as a shuttle of sorts, carrying that extra electron from the anode to the cathode. At the cathode, the lithium ions are absorbed, freeing up those ionizing electrons to act as current. To recharge the cell, simply add electricity, which drives the lithium back out of the cathode and into the anode, and it’s ready to do it all over again.

Now tightly roll sheets of anode and cathode material and cram them into a cylinder. That's one lithium-ion cell. All kinds of things can go wrong in this setup, from a buildup of gas that bursts the exterior cylinder to an actual metallic lithium link forming between the anode and cathode that then sets off what engineers call "thermal runaway." It's more commonly known as fire, helped along by the fact that other components of the cell, such as the plastic separator and the organic solvent, burn nicely, much like gasoline. "It's a chemical fire at its heart," Chamberlain explains.

Put enough cells next to each other and a defect in one can quickly become a defect in all, thermal runaway on the scale of a car-sized battery pack. These breakdowns of the battery generate their own heat, or, in the words of chemists, the reaction is exothermic—enough so that the heat from one cell can set off another. That's why the software to manage the cooling and recharging of electric vehicle batteries is as important as the lithium ion battery pack itself.
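
The cell-to-cell cascade is easier to picture with a toy calculation. The following sketch is not a real battery model (every number in it is invented for illustration), but it shows the qualitative point: once one cell in a string is hot enough, conducted heat can push its neighbor past a trigger temperature, which releases more heat, and so on down the line.

```python
# A toy picture of cell-to-cell thermal runaway, not a real battery model:
# every number below is invented for illustration. Each hot cell warms its
# neighbors in proportion to how far it sits above ambient and sheds some
# heat to the surroundings; any cell pushed past a trigger temperature
# releases its own burst of stored chemical energy.

N_CELLS = 10
AMBIENT = 25.0        # resting temperature, degrees C
TRIGGER = 150.0       # temperature at which a cell goes into runaway
RUNAWAY_HEAT = 400.0  # temperature jump a failing cell adds to itself
COUPLING = 0.15       # fraction of a cell's excess heat passed to each neighbor
LOSS = 0.2            # fraction of excess heat shed to coolant/air each step

temps = [AMBIENT] * N_CELLS
failed = [False] * N_CELLS
temps[0] = 600.0      # cell 0 has been punctured and is already burning
failed[0] = True

for step in range(40):
    new_temps = temps[:]
    for i, t in enumerate(temps):
        excess = t - AMBIENT
        for j in (i - 1, i + 1):          # warm the immediate neighbors
            if 0 <= j < N_CELLS:
                new_temps[j] += COUPLING * excess
        new_temps[i] -= LOSS * excess     # and shed some heat
    for i, t in enumerate(new_temps):
        if not failed[i] and t > TRIGGER: # neighbor pushed past its trigger
            failed[i] = True
            new_temps[i] += RUNAWAY_HEAT
    temps = new_temps

print(f"Cells in runaway after 40 steps: {sum(failed)} of {N_CELLS}")
```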

And that's where Tesla has distinguished itself. (Tesla declined to provide someone to comment for this story but referred this reporter to the National Fire Protection Association and the company's online safety video.) The new car company confines each Model S's more than 6,500 lithium ion batteries from Panasonic in 16 individual modules—separate but equal and comprising the vehicle’s overall battery pack. By separating the modules in this way a mishap in one module is unlikely to spread to another module. In addition Tesla's battery pack is cooled with a glycol-based chemical cocktail, blue in appearance, that can quickly whisk away any excess heat. There is also a "firewall" between each module, according to Tesla CEO Elon Musk, suggesting that some kind of heat resistant material is segregating the modules. It seems that just one module burst into flames in the October 1 incident in Washington involving the pierced battery pack. The battery management system worked well enough that the car’s navigation system warned the driver to pull over and get away from the vehicle. He walked away from the accident unharmed.

That is often not true of crashes involving gasoline.

And keep in mind that part of the reason an electric vehicle cannot go as far as a gasoline-burning car is that even the best lithium-ion battery holds only roughly 200 watt-hours of energy per kilogram; gasoline holds 1,700 watt-hours per kilogram. Less energy stored means less energy to unleash if something goes wrong. "We are already carrying around really energy dense materials in our vehicles," Chamberlain notes. "We should be comfortable."
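
Those per-kilogram figures are enough for a back-of-the-envelope mass comparison. The sketch below uses the 200 and 1,700 watt-hour-per-kilogram numbers quoted above; the 85 kilowatt-hour pack size is an assumed round figure for a large electric car, not a quoted specification.

```python
# Back-of-the-envelope mass comparison using the per-kilogram figures quoted
# above (roughly 200 Wh/kg for a good lithium-ion battery, 1,700 Wh/kg for
# gasoline). The 85 kWh pack size is an assumed round number for a large
# electric car, not a quoted specification.

battery_wh_per_kg = 200.0
gasoline_wh_per_kg = 1700.0
pack_energy_wh = 85_000.0  # assumed usable energy of a large EV pack

battery_mass_kg = pack_energy_wh / battery_wh_per_kg
gasoline_mass_kg = pack_energy_wh / gasoline_wh_per_kg

print(f"Battery mass to hold {pack_energy_wh / 1000:.0f} kWh: {battery_mass_kg:.0f} kg")
print(f"Gasoline mass holding the same energy: {gasoline_mass_kg:.0f} kg")
print(f"The battery is roughly {battery_mass_kg / gasoline_mass_kg:.1f} times heavier")
```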

Or as Tesla's Musk put it: "For consumers concerned about fire risk, there should be absolutely zero doubt that it is safer to power a car with a battery than a large tank of a highly flammable liquid."

Put out the fire
Once a battery fire gets started, however, that fire can be harder to put out than a gasoline fire. In Washington, firefighters did not help matters by cutting holes into the metal frame and thus allowing more oxygen to reach the battery fire in progress. The best thing to do may be nothing.

"In our controlled laboratory setting, we prefer to let a lithium-ion battery fire burn itself out," says Chris Orendorff, principal investigator for the Battery Abuse Testing Laboratory at Sandia National Laboratories, or the guy whose job includes finding out what it takes to blow up any given battery. That is also the guidance that Tesla gives first responders in its emergency response guide: "Battery fires can take up to 24 hours to fully extinguish. Consider allowing the vehicle to burn while protecting exposures."

If a more aggressive course of action is taken, as also happened in the Washington State wreck, beware of putting too little water on a lithium-ion battery fire. If the amount of water is insufficient, the fire will simply appear to go out—before bursting out anew. And if small amounts of metallic lithium have formed as a result of the lithium-ion failure (lithium-ion batteries, despite the name, typically do not contain metallic lithium), the reactive metal can burst into flames because of something as simple as humidity in the air. In attempting to douse a lithium-ion fire, either a lot of water is required or alternative fire suppressants, like CO2 or other chemicals such as Halon. The National Fire Protection Association notes this as an area requiring more research to determine the best approach.

More worryingly, if left to itself after being damaged in an accident, a lithium-ion battery can slowly degrade to the point where it spontaneously bursts into flames weeks later. In essence, a damaged lithium-ion battery can produce hydrofluoric acid (a highly corrosive and toxic acid) from fluorinated compounds in the battery; the acid then further damages the cell and can set up the conditions for a fire. Something similar happened in the case of the Chevy Volt that burst into flames weeks after safety testing.

Then there's the toxic vapors: the aforementioned sulfuric acid, plus bits of various metals that can be liberated by the fire, including aluminum, cobalt, copper, lithium, and nickel. Anyone anywhere near such fires, particularly in an enclosed space, should wear full protective gear and self-contained breathing apparatus that allows no outside air into the system. Sulfuric acid is no picnic (although it also finds use in the electrolyte of some lead-acid batteries and is part of the reason that more than 2,000 people suffer chemical burns from using lead-acid batteries, such as the ones in conventional cars, each year.)

Already, more than 100,000 electric cars ply U.S. roadways. Such novel fires will become more common as more and more electric vehicles hit the road, whether the luxury Tesla Model S or cheaper alternatives such as the Ford Fusion Energi. That will in turn mean that the NHTSA and other regulators will need to devise new e-car safety tests, a process that an interagency task force, including the U.S. Department of Energy, is currently working to complete.

One reason the Tesla burst into flames in Washington is inherent to the car's design: the battery pack forms the very bottom of the Model S. This "skateboard" design is what makes the Model S so stable to drive.

But the skateboard design also exposes the battery pack to the hazards of the road. A piece of metal run over by a gasoline-powered car would have torn up the muffler and exhaust system, or possibly punctured a fuel line. In the case of the Tesla Model S, it punctured the battery and set off a fire. The NHTSA will investigate. "It is important to evaluate and understand a cell or battery response to all types of abusive scenarios," Sandia's Orendorff argues, "so that proper design, chemistry or engineering improvements can be made to mitigate these risks."

Or, as Carlson put it in an email to Tesla customer service: "I was thinking this was bound to happen, just not to me. But now it is out there and probably gets a sigh of relief as a test and risk issue—this 'doomsday' event has now been tested, and the design and engineering works." In other words, battery fires are no reason to kill the electric car.

Thursday, September 5, 2013

Posted by lelyholida
No comments | 9:35 PM
The costs of reducing emissions may be flash points along the path toward a 2015 Paris treaty

At a major United Nations climate summit in Warsaw this week, a plan is being hammered out for negotiations on a new climate treaty to be finalized in Paris in two years’ time. Delegates from 195 nations are also seeking to obtain commitments from countries to limit their greenhouse-gas emissions between now and 2020. But the path forward is rife with disputes between rich and poor countries over funding, and how to allocate and enforce emissions reductions.

The conference aims to outline the schedule and to set parameters for negotiations ahead of the next major climate summit in Paris in 2015, when countries hope to forge a treaty to follow the 2009 agreement settled on in Copenhagen.

At that meeting, negotiations over a formal treaty broke down, but eventually resulted in a set of non-binding pledges — the Copenhagen Accord — for emissions reductions until 2020. The accord also blurred the distinction between developed countries, which were bound by the 1997 Kyoto Protocol to reduce emissions, and developing countries, which had no such obligations. Since then, negotiators have worked on how to structure a new framework that would involve climate commitments from all countries — including China, now the world’s largest emitter, and the United States, which never ratified the Kyoto Protocol (E. Diringer Nature 501, 307–309; 2013).

The Warsaw talks are split into two main tracks. One focuses on the architecture of a new global climate treaty that would take effect after 2020, when the current Copenhagen commitments expire. The second examines what can be done to strengthen commitments between now and 2020 to increase the chance of limiting global warming to a target of 2 °C above pre-industrial temperatures (see ‘Emissions up in the air?’).

The European Union (EU), for example, has proposed a multi-stage process, whereby commitments for climate action post-2020 would be registered next year and then subjected to an international assessment to determine how well the commitments measure up against each other and against scientific assessments. The final commitments would then be registered in Paris in 2015. By getting countries to volunteer their climate commitments and comparing them in this way, the hope is that nations with unambitious targets might be shamed into strengthening them. The EU has also called for a review of pre-2020 commitments.

Tasneem Essop, who is tracking negotiations for the environmental group WWF in Cape Town, South Africa, says that these short-term commitments are crucial for pointing the world in the right direction. “The biggest challenge will be to ensure that emissions do peak within this decade,” she says.

The cost of reducing emissions could be the first flashpoint in Warsaw. In Copenhagen, developed countries agreed to provide US$30 billion in climate aid from 2010 to 2012, and to increase climate support to developing countries to $100 billion annually by 2020. Although the short-term commitments were largely met, there is no clear plan for attaining the goal of $100 billion a year. From emerging giants such as Brazil and China to poor countries in Africa, developing nations are demanding that wealthy countries ramp up funding and create a viable path to this goal.

With public coffers strapped, many developed nations are looking for other funding sources. One possibility is to place some type of levy on international aviation, which is being considered by the International Civil Aviation Organization in Quebec, Canada. The body has committed to craft an agreement by 2016 that could take effect by 2020.

Negotiators in Warsaw will haggle over how to finance and ultimately deploy climate aid through organizations such as the newly launched Green Climate Fund, based in Incheon, South Korea. Another flashpoint is the developing countries’ demand for a ‘loss and damage’ mechanism to compensate poor countries irreparably harmed by climate change.

But the biggest questions will center on the framework for the treaty in 2015. Before Copenhagen, the emphasis was on a treaty similar to the Kyoto Protocol that would lock in legally binding emissions reductions. In Copenhagen, the United States and other developed countries pushed for an alternative that would allow individual countries to register commitments, which would then be reviewed at an international level. Delia Villagrasa, a senior adviser for the European Climate Foundation in Brussels, says that the talks are moving towards this bottom-up approach, which would be combined with a formal review to assess commitments and identify ways to scale them up. The world could get its first hint of what such a system might look like as the talks wrap up next week.


“Warsaw will bring some clarification on the structure of the new agreement,” Villagrasa says. “That’s not sexy for the media, but it’s important.”
