The Perspective 
Tuesday, 20 January 2015

David McCracken
Livewire Digital

Pizza Hut is probably the last place in the world I’d expect to be wowed by technology. Don’t get me wrong, I love sinking my choppers into a stuffed crust as much as the next guy, but there’s not much about the restaurant chain that screams “on the cutting edge of tech.” But I can admit when I’m wrong, and boy was I wrong about this. (Watch this video and you’ll see why.)

The interactive tabletops Pizza Hut is introducing are not only fun and different, they virtually eliminate the major pain points of eating at a restaurant. They remove the annoyance of having a slow server, so you can order your food the instant you’re ready. They also take human error out of the ordering process. No matter how complicated your order, the self-service process nearly ensures it will arrive at your table correctly.

What makes the touch screens so effective is their ability to help customers visualize and customize exactly what they’re ordering. But pizza is not the only place you’re going to see this technology. Consider the applications that are starting to appear in other industries:

1. Retail

Soon touch screens will serve as virtual dressing rooms in clothing stores. Customers select a model with a body type similar to theirs and swipe to try out different combinations of clothing styles, sizes, and colors. Then they select their favorite items, head to the register, and a sales associate meets them with their purchases, ready to check out.

2. Hospitality

Soon desks in hotel rooms will double as interactive, self-service concierges. Guests can order room service, wake-up calls, laundry services, and more. They can also browse through information on different restaurants and attractions in the surrounding area and easily make reservations straight through the touch screen software. It’s easy to see how advertising will play a key part in this setup.

3. Tourism

We’re already seeing travel centers use these types of touch screens as giant interactive brochures. Tourists drag activities, attractions, and hotels from a map onto a calendar to plan out their travel schedules. Since the screens can be updated in real time, they often include travel times and prices to give tourists a comprehensive overview of vacation options.

The possibilities are endless with this incredible self-service software. Here’s a restaurant that’s using these tabletops to educate diners about the origins and characteristics of different foods. What creative uses can you see for this technology?

Posted by: Admin AT 09:44 am   |  Permalink   |  
Monday, 06 July 2009
Banking machine applications that involve paper transport include check processing, billing and currency handling. All this equipment, particularly ATMs, requires precise movement and paper capture at high speeds and volumes, and the machines must retain these capabilities under hostile climatic conditions. Extreme heat and humidity, as well as extreme cold and dry conditions, create changes in electrostatic levels and friction in paper-transport equipment, which can cause jams and double-feeds. For the engineers who select rollers, belts, and pads for ATMs, consistent conductivity and levels of friction are vital considerations in both original and replacement parts.
Albert C. Chiang
High-performance urethane is ideal for these applications. It has clear advantages over traditional materials like rubber or silicone for rollers, wheels, belts, and pads. Urethane provides excellent shock and vibration damping, abrasion and wear resistance, great durability and high mechanical strength.
More important to ATM manufacturers and replacement-part suppliers, it offers customizable levels of conductivity and of coefficient of friction, both of which greatly increase equipment reliability and performance and eliminate the need for overly tight tolerances in design. 
Controlling conductivity
The most commonly practiced method of making urethanes semi-conductive is by “doping”: adding select amounts of conductive materials to the mix during part manufacture. While this does adjust the electrostatic properties of the part, each additive comes with its own drawbacks.
Adding carbon controls conductivity, but late in the life of the part the carbon can stain the paper being transported. High-voltage vacuum fiber coating offers limited two-dimensional (surface) control of conductivity, but it is expensive. Conductive metal dispersion, using powdered metal in blends from 10 percent to 80 percent, controls electrostatic properties fairly well, but it is difficult to achieve uniform distribution of the powder during parts manufacture, since the metal tends to settle.
Urethane belt manufactured by MPC. The urethane can be customized for electrostatic properties, coefficient of friction, and hardness to suit all paper-handling tasks.
Ammonium salt additives address static, but their conductive properties vary considerably with humidity — and at high humidity, they make the surface of the part sticky.
The molecular method
The best way to customize conductivity is to modify the urethane structure at the molecular level, making the material itself semiconductive instead of relying on additives.
Urethane parts customized in this way have a volume resistivity of 5E5 to 5E10 at hardnesses of 5 Shore A to 80 Shore D for solid urethanes, and 20 Shore 00 to 90 Shore A for foam. The same method can be used to create parts that are completely antistatic. Combined with a customized and constant coefficient of friction, the electrostatic properties of these parts ensure precise movement of paper at high speeds, eliminating the double-feeds, feed failures, and jams that cause errors in conveying checks or currency and require costly machine service.
Getting a grip on paper transport
Urethane gear manufactured by MPC. The urethane can be customized for electrostatic properties, coefficient of friction, and hardness to suit all paper-handling tasks.
Customization of urethane can also be useful above the molecular level, where controlling the grip of a belt or a roller is critical, especially for high-speed paper-transport systems such as ATMs. Chemically modifying the urethane molecules during manufacture can increase the part’s coefficient of friction, making it greater than that of rubber or silicone parts of similar hardness.
The proper coefficient of friction enables a part to grab one piece of paper at a time, move it a precise distance, and let go at the exact right moment, throughout the long life of the part. Such chemically based performance is achieved through decades of experience and fine-tuning of the formulations and manufacturing processes.
Semiconductive urethane parts for paper-transport tasks create the best balance of precision static control, coefficient of friction, abrasion resistance and toughness to ensure long-lasting, jam-free performance in ATM equipment. The result for the ATM manufacturer and service supplier is increased equipment accuracy and precision, reduced downtime, fewer service calls and replacement parts, and greater customer satisfaction.
Posted by: Albert C. Chiang AT 12:53 pm   |  Permalink   |  0 Comments  |  
Tuesday, 12 August 2008

Has this ever happened to you? 
You walk into a store or a business and see on a display counter or lobby desk a dilapidated PC sitting forlornly, unused and unappreciated, only to realize the poor PC is intended to provide self-service? 

Or perhaps instead it is a beautifully branded specialty kiosk, but with a screen that is either dead, showing the Windows desktop or — having been hacked — is displaying something entirely inappropriate?

Both are classic self-service failures, and I always wonder what circumstances transpired to cause these outcomes. Deep down inside, I also wonder whether the outcome would have been different had I been involved in the project.

I like to think the answer is always a resounding "Yes!" but sometimes it isn't that simple. 
What is clear is that failed self-service projects are, by definition, very public and highly visible disasters for our industry. Everyone who comes into that store or business sees the failure. The self-service market is growing rapidly and is remarkably well-accepted by the general public, so I don't believe the occasional project failure is overly threatening to the health of the industry. However, I do believe the industry could be even further along its penetration curve if there were more successful projects.

The three-legged stool
There are many ways that a project can fail. Perhaps the most obvious is a poorly designed application. However, I will address two others: hardware and kiosk system software.

When describing a successful project to a new deployer, I often use the analogy of a three-legged stool. The three legs are the software application, the hardware and the kiosk system software.

Without all three legs, the stool will topple.

The software application leg is almost always present because this leg typically is what drove creation of the project in the first place. Without an application, there is no project. It may be a poorly designed application, but at least it exists.

The hardware leg is the physical manifestation of the project and generally takes the form of standard kiosk-industry components and perhaps a company-branded OEM enclosure. Sometimes, but not very often, a standard PC with perhaps a touchscreen display is appropriate. Due to the capital-intensive nature of kiosk hardware, this leg too often is compromised. This can cause the kiosk to be used less than it theoretically could be or, in the worst case, render it totally unusable.

The kiosk system software is the most likely leg to be missing because it generally is invisible to the user, so deployers either think the software is unnecessary or aren't aware it exists.

What is kiosk system software? It is a software layer that locks down the computer and prevents users from going places they shouldn't. It can perform usage logging, which reports how often the application is being used, and remote monitoring, which checks that the kiosk is always up and running without error. In short, it is what ensures the software application is always available to the user.
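As a rough illustration only (not any vendor's actual product), the usage-logging and remote-monitoring duties described above boil down to something like the following sketch; the class and field names are hypothetical:

```python
# Hypothetical sketch of two kiosk-system-software duties: usage logging
# (how often the application is used) and heartbeat generation for remote
# monitoring. All names and fields are illustrative, not from a real product.
import json
import time

class KioskMonitor:
    def __init__(self, kiosk_id: str):
        self.kiosk_id = kiosk_id
        self.sessions = 0
        self.last_activity = None

    def log_session(self) -> None:
        """Record one user session for the usage report."""
        self.sessions += 1
        self.last_activity = time.time()

    def heartbeat(self) -> str:
        """Status message a remote monitor polls to confirm the kiosk is
        up and the application is available to users."""
        return json.dumps({
            "kiosk": self.kiosk_id,
            "status": "up",
            "sessions_today": self.sessions,
        })
```

In a real deployment the heartbeat would be pushed to (or polled by) a central server on a schedule, and a missed heartbeat would trigger an alert.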

If the deployer develops the project from the perspective of the user, he may miss the need for system software entirely. Or, a deployer may decide he can develop a partial homegrown solution such as using OS system policies to lock down the computer. Sometimes, the software application is developed to also provide kiosk system software functionality; however, that requires very specialized knowledge and experience, and with many self-service applications being browser-based, it is very difficult to combine the two software legs of the stool.

Project on the rebound
We get many desperate phone calls after a project has been deployed. 

Generally, either the hardware leg or the kiosk system software leg of the stool is missing. Sometimes, both legs are missing. Hardware vendors know that one of two outcomes will occur when the hardware leg is missing: The deployer eventually will find the money to invest in hardware, or the project will be cancelled as a failure. The former is not ideal, but at least the proper hardware eventually is purchased. The latter is the true tragedy, because the opportunity has been lost, and who knows how long it will be before that organization tries again? 

Because the kiosk system software leg is not as capital intensive as hardware, generally as soon as the light bulb turns on in the deployer's head, the problem is resolved.

Unfortunately, sometimes the light bulb never turns on.

A call to action
What can be done? Education will go a long way. Both hardware and software vendors need to ensure that their clients fully understand the need for all three legs of the stool: Application, hardware and system software. I know that I am guilty of occasionally focusing too much on the software aspect of a project, but it isn't done purposely. Often a deployer will contact me, but they are talking to me for my software expertise. They don't necessarily want to talk to me about hardware, and sometimes that is where the discussion ends. 

As an industry, we need to concentrate on keeping the discussion going.

James Kruper is the president of Analytical Design Solutions Inc., one of the association's vendor members. Check out his previous column, which was published in April: Self-service and digital signage can co-exist.

Posted by: James Kruper AT 11:49 am   |  Permalink   |  0 Comments  |  
Monday, 24 September 2007
Lief Larson is a media technologist residing in Minneapolis. He is also the founder and former editor-in-chief of Kiosk magazine, the forerunner of Self-Service World magazine.
The concept of “self-service 2.0” is a forward-thinking vision for the possible interactive and visual displays of tomorrow, based on technology that exists today.
Kiosk and self-service technology is evolving. Over the next 12 months to 24 months the industry likely will experience a turning point for new technologies and applications that add a new level of convenience and accessibility to people’s lives — think: thin client, RFID, kiosk/digital display convergence, data transport, customer sensing and other tools.
Here’s a look at where the technology may be headed:
For years, those in the kiosk realm have predicted the advent of thin client/small form factor devices, yet boxy terminals remain prevalent. A peek around the corner suggests that software which once resided primarily on kiosk hardware will slowly disappear in favor of Web-based, software-as-a-service platforms. In this scenario, the terminal only needs to connect to the Web and to a limited number of local software applications, such as drivers for hardware add-ons like credit card readers and printers.
The advantages of thin client are lower hardware costs and less chance for downtime — fewer parts mean less probability of failure. What hardware remains in the kiosk enclosure will be small and stable. Look for ultra-compact embedded hardware such as mini/pico-ITX boards and 17-inch to 22-inch panel PCs for under $999 to blaze trails in this area.
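Under this thin-client model, the only software left on the terminal is a small local "device bridge" that the Web application calls for hardware access. A hypothetical sketch of such a bridge's dispatch logic follows; all action names and fields are invented for illustration:

```python
# Hypothetical sketch of a thin-client kiosk "device bridge": the Web-based
# application sends small JSON requests, and this local layer dispatches
# them to hardware drivers (printer, card reader). The actions and field
# names are illustrative only.
import json

def handle_device_request(raw: str) -> str:
    """Dispatch one JSON request from the Web app to a local device stub."""
    request = json.loads(raw)
    handlers = {
        # Stubs standing in for real driver calls:
        "print_receipt": lambda p: {"status": "queued", "job": p.get("text", "")[:80]},
        "read_card":     lambda p: {"status": "waiting_for_swipe"},
    }
    handler = handlers.get(request.get("device_action"))
    if handler is None:
        return json.dumps({"status": "error", "reason": "unknown action"})
    return json.dumps(handler(request.get("params", {})))
```

The design point is that everything above this layer (the application itself) lives on the Web, so replacing a terminal means replacing only this thin bridge and its drivers.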
The kiosk terminal (interactive) and digital signage (non-interactive) will combine. Previously at a kiosk terminal users would navigate and control it with interface tools, whereas a digital sign simply would display information without any user control or interactivity. An emerging trend in digital displays allows users to immerse themselves in the experience by interacting with digital signage through user triggers such as multitouch screens, cell phones and RFID.
Kiosk and digital signage company Nanonation, for example, has provided cutting-edge applications for Royal Caribbean Cruise Lines and Umpqua Bank.
Brian Ardinger, Nanonation’s senior vice president of marketing, said the key is to not interrupt the customer experience, but to enhance the environment with information, images and intuitive systems. “By giving customers access to the information and marketing tools necessary when they ask for it or when they are naturally interacting with something is more powerful than hoping a person sees the right loop at the right time.”
Self-Service 2.0 Applications
• At self-service kiosks, concert-goers use USB sticks to buy, download and take home live recordings of the event after the show. Although this technology has been around for years, hundreds of thousands of people now carry USB sticks; they have become the single most popular means of physically transporting data files. The music group Barenaked Ladies have been selling DRM-free concert music for USB sticks. Thumb drives and mobile media memory sticks soon also may allow customers to take information away from environments such as retail stores.
• Sequoia Media Group launched myMovieMaker, a service now available at Wal-Mart stores that instantly transforms consumer digital photos into personalized DVD movies. The service uses Hollywood-style effects and themed storyboards to empower customers with professional movie production. It is likely that soon we will see advanced kiosk functionalities that address marketplace trends, such as developing video for sites like YouTube and customizing large-ticket purchases such as cars and furniture.
• French video rental giant CPFK has developed the Moovyplay system to rent movies for 30 days using a portable hard drive. A customer loads up the drive at an in-store kiosk, then plugs it into a docking station connected to his TV set. The first-generation drives, which are about the size of a BlackBerry device, can store up to 14GB of data, enough to hold about 40 movies at DVD quality. Users pay for the movies with a prepaid card, and revenue is split between the studios and the retailer. Household pipelines are bandwidth-limited for high-definition file sizes, which may present opportunities for kiosks to help put file transfers into the hands of consumers through large file transport devices. Is anyone else scared by the word “teraflop”?
• Freedom Shopping is a state-of-the-art self-checkout retail kiosk that uses RFID technology for speedy checkouts. There is no learning curve for the shopper — simply approach the kiosk with the items to check out, and a helpful voice guides the shopper through an effortless transaction. We have just reached the true opportunity of RFID in self-service. Expect dozens of new test applications that work through RFID to roll out over the next 18 months.
• LocaModa is developing technology that works with self-service kiosks and other out-of-home networks such as Wi-Fi hotspots, narrowcast digital signage and IP-based entertainment networks (from jukeboxes to cinemas) that can be leveraged to provide interactivity, presence and commerce for mobile consumers. The next generation self-service movement calls for cell phones and connected PDAs to act as remote controls for calling up information.
• Rogers Wireless, a leading telecom communications company in Canada, is using in-store kiosks that act as an extension of the sales force, educating and captivating customers and increasing potential sales opportunities. The interactive touchscreens, working in tandem with motion-sensing technology, inform consumers about specific products of interest to them. Whether trying to determine when a customer is near your self-service terminal or digital display, or triggering applications, sensing technology offers a huge chance to intuitively engage customers at the moment of opportunity.
Posted by: Lief Larson AT 11:53 am   |  Permalink   |  0 Comments  |  
Monday, 17 September 2007
Bill Gerba, president of WireSpring Technologies, regularly blogs about digital signage. The following column first appeared on that blog.
By now, you've probably heard about last year's massive security breach at TJX (the parent company of TJ Maxx, Marshalls and a few others), which resulted in the theft of millions of credit card numbers and other pieces of personally identifiable information. As the different versions of the story have come and gone, the culprits were either hackers sitting in a nearby parking lot who infiltrated an unsecured wireless network, fake kiosk repairmen who installed phony keypads to steal credit card numbers and PIN codes, or ex-employees who had access to key records and resources. But a new twist covered by Information Week and StorefrontBacktalk suggests that problems with TJX's in-store security practices (or lack thereof) allowed the attackers to use job application kiosks as a vector into the corporate network. Regardless of what the actual method of attack turns out to be, you never want to leave those doors open. And since virtually every digital signage and kiosk network relies on having networked devices somewhere in the store, now seems like a good time to review some dos and don'ts for in-store computer security.

Depending on which version of the TJX kiosk story that you believe, hackers either replaced an encrypted PIN pad, inserted hardware keystroke loggers, used USB key drives to inject malicious software, or some combination of the three. This brings to mind a couple of guidelines that should always be remembered when placing computers in places where unauthorized people can get to them:
Lock 'em down. If you're putting a self-service kiosk on the sales floor and expect your customers to interact with it, you'd better be sure that any cables are securely fastened, unused ports are closed off (both physically and in software), and any access doors or panels are secured with a key or combination lock. In one version of the TJX story, phony tech staff physically tinkered with the kiosks, but in every version it should not have been physically possible to even install the device (USB key drive, fake PIN pad or keystroke logger). To prevent this, secure and cover all cables and openings. Even better, use an all-in-one appliance like IBM's Anyplace Kiosk with an on-screen keyboard for data entry. This eliminates the need for most external peripherals, and the ports seal up nicely, too.
Out of sight, out of mind. Taking item #1 a step further, if you don't need to have your computers sitting out where anybody can get at them, lock them up somewhere else. For a kiosk application, that might mean putting the CPU in a locked cabinet or closet (though the IBM Anyplace Kiosk obviates the need for this, provided you've bolted the thing down, of course). For digital signage applications, make sure your players are either sitting in a locked enclosure if they're kept behind each screen, or even better, put all of the media players in a secure room or closet, and use video distribution equipment to carry the signal to screens elsewhere in the store. One quick anecdote here: not too long ago we won a digital signage deal away from a competitor who, in addition to not having the best product for the customer's needs, also used laptops as the media players driving each screen. Unsecured laptops. Laptops that were simply cable-tied to a mounting bracket behind each screen. Let's just say that after a month-long trial period, many of the customer's "media players" had mysteriously gone missing.
Batten down the hatches. Visa, MasterCard, and other payment groups started catching flak for a lot of the more serious retailer data breaches a few years ago, and they responded with a new program called the PCI DSS (Payment Card Industry Data Security Standard). This applies to retailers as well as other parties, and outlines specific guidelines for handling cardholder data. For POS software and other payment-oriented applications, a special certification called PABP (Payment Application Best Practices) applies. Getting certified for PABP is an expensive and time-consuming endeavor. However, PABP certification is absolutely essential for kiosks that use credit cards for payment or identity verification, and it's also a very good idea for any computer-like device or service that comes within striking distance of a retailer's payment processing and data storage systems. Installing a spiffy new kiosk platform, or maybe a digital media network? Find out from your vendor if their software is up to snuff. Remember, even if your device doesn't actually accept credit cards, it could still be used as an attack vector to get to POS systems or other devices on the store's network that do house this data. Taking a point from the TJX story, it's also a good idea to disable any unused ports and peripherals in the computer's operating system and password-protect the BIOS, which further reduces the risk of tampering.
Don't forget to lock the gate! I think the most amazing and hard-to-believe version of the story came from Information Week, which suggested that USB key drives were used to install rogue programs on the kiosks. (What? The kiosk software allowed new programs to be installed?) This gave the attackers unfettered access to TJX's corporate network, as the kiosks were not separated from the rest of the network by a firewall! If this were 1991 and the Internet were still a cool toy for academics and scientists, I might have let that slide. But seriously, this is 2007 and the attack in question happened quite recently. Whether you're using kiosks or not, anybody who doesn't believe that an extra Ethernet jack in the wall is a potential attack vector is deluding himself: important data should always be protected by a firewall. Forget about locking the gate; if this story is true, TJX's IT staff didn't even bother to install one.
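To make the firewall point concrete, here is a hypothetical sketch of the default-deny rule a kiosk's network segment should enforce. The host names and ports below are made up for illustration; the point is that anything not explicitly allowlisted, such as corporate POS or database hosts, is refused:

```python
# Hypothetical sketch of default-deny egress filtering for a kiosk:
# only destinations on an explicit allowlist are reachable; everything
# else on the corporate network is blocked. Host names are invented.
ALLOWED = {
    ("jobs.example.com", 443),     # the job-application Web service
    ("updates.example.com", 443),  # software updates
}

def egress_permitted(host: str, port: int) -> bool:
    """Default-deny: permit a connection only if it is allowlisted."""
    return (host, port) in ALLOWED

def audit(attempts):
    """Return the attempted connections a firewall should have blocked."""
    return [(h, p) for h, p in attempts if not egress_permitted(h, p)]
```

In practice this policy would live in the firewall or router configuration rather than application code, but the logic is the same: enumerate what the kiosk legitimately needs, and deny everything else by default.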
This story just goes to show that no matter how many best practices guidelines and review meetings an organization has, it's all worthless without proper execution. While TJX only expects to take a modest financial hit from this breach (the $17 billion-a-year retailer is allocating less than $200M to cover all of the damages), a lot of customers and other businesses are upset over the exposure of their personal information. Worse, it stands to reason that there are other retailers out there with similar security practices, which are in desperate need of review and updating. And while security is certainly becoming an ever more important part of an IT staff's job, the proliferation of in-store computers for self-service kiosks, digital signage, Bluetooth/SMS beaconing, traffic monitoring, and security applications suggests that the problem will continue to grow.

There is some good news, though. All of the involved parties -- retailers, vendors and consumers -- have a vested interest in seeing things improve. Vendors must continue to improve their products, designing new systems and updating existing ones to make security features a high priority. Likewise, retailers need to make sure that security plays a significant role in their policies and practices, taking advantage of new vendor-supplied solutions as they become practical and verifying that any new hardware and software purchases are compliant with the latest security mandates and standards (like PCI and PABP). And customers (that's all of us) have the most important job of all: telling retailers and vendors exactly how we feel when they slip up.
Posted by: Bill Gerba AT 11:56 am   |  Permalink   |  
Monday, 02 July 2007
Editor’s note: The writer is executive vice president of BroadSign International.
The two operating systems most widely used for digital signage networks are Windows and Linux. The purpose of this article is to help you get an idea of the advantages and the total cost of ownership of both in order to make an informed decision about choosing an OS for your digital signage network.
Many startup network operators, trying to minimize their upfront investment burden, choose Linux because they don’t have to pay for the license. Some of them discover later that the resulting cost of running Linux is not necessarily lower than that of using Windows.

When is Linux the right choice for your network? What are the real strengths and weaknesses of each OS?
In terms of functionality, the two operating systems are essentially equal. The common perception, however, is that Linux is "free", or cheaper to operate than Windows. Here are some thoughts on the subject that we have summarized based on our field experience and discussions with clients.
Microsoft Owns Windows. Who Owns Linux?
Microsoft clearly dominates the market. A new Windows version appears every two to five years. You do have to pay for a Windows license, and the fee depends on the version, e.g., XP Professional, XP Embedded, Server 2003 or Vista.

Linux, on the other hand, has no central owner and most of its components are free (as in: freedom to copy, modify, and distribute) software. This allows many amateur enthusiasts, technology companies, and non-profit organizations to extend and publish their own Linux distributions. Some of the most well-known distributions are Fedora Core/Red Hat, SuSe/Novell, and Debian/Ubuntu.
Each one of the major Linux distributions has a free version; however, in this case support is usually limited to what the community can provide through online forums and mailing lists. Some providers offer special enterprise versions for a fee that covers direct support from the vendor, and others offer support as an on-demand service.
Is Linux Truly Free?
The zero licensing cost of free Linux distributions is often the most attractive aspect of this operating system choice, especially for large digital signage networks. But this may be misleading, as the licensing fee is only part of the overall expenses that you have to project.
The most commonly overlooked cost related to deploying a network of Linux players is that of hardware support. In fact, hardware support service providers are the market to which enterprise Linux distributors are catering. All enterprise versions come with a list of officially supported hardware, and a support schedule (e.g. forward compatible for five years). This is similar to what Microsoft has been doing with new releases of Windows for well over a decade.
Such support is not usually available with the free versions, though notably the Ubuntu Linux distribution comes with a guarantee of forward compatibility for five years. This is why we at BroadSign have standardized our Linux offerings around this distribution.
However, in order to fix problems and bugs you may encounter in a free Linux distribution, you have to either wait until they are resolved by the open-source community, or invest (sometimes a lot) into custom development.
Standardizing on a Playback PC Configuration
Windows versus Linux has been one of the longest-standing debates among IT specialists. They argue about which system is more stable, more secure, or yields higher performance. We have discovered that there are no inherent overwhelming advantages in either of the two operating systems. The stability, security and efficiency of a system really depend on which environment your IT team is more proficient in: Linux or Windows.
One grave mistake is to select a hardware platform and an operating system separately. Another is not to test the selected hardware/OS combination for performance and endurance.
Most device drivers have been tuned and tested for standard desktop uses, not for usage in an appliance-style configuration. It is therefore important to test many configurations before standardizing on a playback PC.
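Such endurance testing can be sketched as a simple soak-test harness that exercises each candidate hardware/OS combination many times and compares failure rates. Everything below is a hypothetical illustration, with `trial` standing in for a real playback run against a representative content loop:

```python
# Hypothetical sketch of an endurance ("soak") test for candidate playback
# configurations: run a playback trial repeatedly per configuration and
# rank configurations by observed failure rate. `trial(name)` stands in
# for launching the real player on that configuration; it returns True
# on a successful run.
def soak_test(configs, trial, runs: int = 1000):
    """Return {config_name: failure_rate} over `runs` trials each."""
    results = {}
    for name in configs:
        failures = sum(0 if trial(name) else 1 for _ in range(runs))
        results[name] = failures / runs
    return results
```

In practice the trial would launch the actual player, play a fixed content loop, and check for crashes, hangs, or dropped frames before a configuration is standardized on.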
If Content is King, then the Operating System is the Kingdom
In terms of content playback functionality, Windows Media is a major factor to consider when choosing between Windows and Linux for digital signage. The vast majority of digital signage software packages use Windows Media Player as their playback engine. This has the benefit of leveraging any Windows Media-specific hardware acceleration and of playing all the media types supported by Windows Media Player.
There are downsides to Windows Media Player as well. The most obvious is that it only runs on Windows. A less noticeable one is that it does not come standard with MPEG-2 and MPEG-4 support, which means that you must acquire licenses for these codecs (media formats) from an independent vendor unless you plan on using only WMV and MPEG-1. These, among other reasons, are why here at BroadSign we developed our own playback engine that is independent of Windows Media Player. All the codecs we support (MPEG-1, MPEG-2, MPEG-4) come included with our software and are the industry's leading standard formats. If you need to play Windows Media Video, we can run it in our proprietary player on Windows as well, because in this specific case we use Windows Media Player.
In all other cases BroadSign Player can run both on Windows and on Linux platforms.
Total Cost of Hardware Ownership
One of the biggest advantages of the Windows operating system is that it supports all the newest hardware. Because of market pressures, hardware manufacturers always develop device drivers for Windows. While some also provide them for Linux, not all of the many different Linux distributions are covered.
Driver support in Linux is not really a problem for PC components that are in widespread use, or that are a little older. The DIY-and-share philosophy of the Linux community, along with increased investment by corporations integrating Linux into their enterprises, means that somebody in the world will eventually fix the problem and everyone will benefit. The shortcoming is that you may have to wait a while for the required driver.
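On a Linux playback PC you can check directly whether the running kernel has bound a driver to every PCI device: sysfs exposes each device as a directory under `/sys/bus/pci/devices`, and the kernel creates a `driver` entry inside that directory only once a driver has bound to the device. A small sketch (the root path is a parameter so the function can also be pointed at a test fixture):

```python
import os

def unbound_pci_devices(sysfs_root="/sys/bus/pci/devices"):
    """Return the PCI device names that have no bound kernel driver.

    Each device appears as a directory under `sysfs_root`; a `driver`
    symlink appears inside it once the kernel binds a driver.
    """
    missing = []
    for dev in sorted(os.listdir(sysfs_root)):
        if not os.path.exists(os.path.join(sysfs_root, dev, "driver")):
            missing.append(dev)
    return missing
```

Running this on a candidate machine before deployment flags exactly the components you may end up waiting on the community to support.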
Another issue for hardware on Linux is that some components are developed exclusively for Windows. An example is the WinModem (aka SoftModem). This modem is much less expensive than a hardware modem because it replaces the DSP chip with DSP software that runs on the PC's CPU, software written by the vendor exclusively for Windows. This trend seems to be spreading into the video card market, as market leaders like Nvidia and ATI develop extensions specifically for Windows Media.
The above factors may increase the total cost of hardware ownership for Linux users.
From a digital signage operations perspective, there are more things in common between Linux and Windows than there are differences. What is important is that you select your hardware in conjunction with your operating system and digital signage software. While Windows has an upfront licensing cost, its costs are fixed and predictable. Linux has the potential of a lower total cost of ownership, but much investment must go into the expertise for selecting the hardware platform, otherwise costs can balloon out of control.
In the end, regardless of the operating system you select, the most important determinant of the total cost of ownership is the competence of the team behind selecting and configuring your playback platform, as well as that of the support team.
There is no magic bullet that will let you dramatically cut costs if you choose Linux. If you don’t already have a Linux-savvy IT department, any cost saving on the license fee will backfire with the increased cost of training or hiring qualified people.
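The trade-off in the last few paragraphs can be put into rough numbers: a fixed per-player Windows license fee versus a one-time investment in Linux expertise (training or hiring). The break-even fleet size is just that fixed cost divided by the per-player saving. All dollar figures in the usage example are hypothetical placeholders, not real license prices:

```python
import math

def break_even_players(linux_fixed_cost, windows_license_per_player,
                       linux_extra_per_player=0.0):
    """Smallest fleet size at which Linux is no more expensive.

    Windows TCO for n players: license * n.
    Linux TCO for n players:   fixed + extra * n.
    Returns None if the per-player saving is zero or negative,
    i.e. Linux never catches up.
    """
    saving = windows_license_per_player - linux_extra_per_player
    if saving <= 0:
        return None
    return math.ceil(linux_fixed_cost / saving)

# Usage (hypothetical figures): a $20,000 training investment against
# a $150 per-player license breaks even at 134 players.
```

The formula also shows why the "magic bullet" does not exist for small deployments: with only a handful of players, the fixed expertise cost dominates everything else.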
Posted by: Brian Dusho at 12:45 pm
Monday, 18 September 2006
After attending The Self Service & Kiosk Show in Orlando, I can’t just walk past a kiosk anymore. I look at them. And I look behind them. And I wonder what is used to ensure it’s working when needed.
When the ATM didn’t work at my bank after a thunderstorm, I wondered: what had they done to ensure customers could use the ATM? It was Sunday and the storm was on Saturday evening. Had it been down that long? 
Last weekend, after enjoying a brunch at a local resort, I had to stop at the kiosk in the lobby that was designed to provide information on local destinations. The screen was black, but it did flicker when I touched it or moved the mouse. I wondered what might have caused it to stop working, and whether measures had been taken to ensure it would work when someone who really needed it came by.
It’s hard to imagine a device any more out on an island than a kiosk. Maximum uptime is critical in order to fulfill its designed purpose of self-service, and yet they are usually unsupervised. 
Quality power is the lifeblood of any electronic system and is vital to achieving the desired performance and uptime. Still, power issues remain a mystery in many industries, even – in many cases – to seasoned technical personnel. The impact on microprocessor-based products has come more to the forefront as technology is designed to deliver more functionality at higher speeds.
To begin understanding and evaluating power issues, it is important to understand the three basic levels of impact on a system: disruptive, degrading and destructive. These events can arrive on any conductor connected to your system, whether it is a power line, a network cable or a phone line.
Disruptive power disturbances cause over 80% of the issues you will encounter, according to numerous power studies. According to Florida Power & Light, over 60% of the various power issues are created inside of a facility from a variety of sources. The sources can include elevators, heating and cooling systems, all the way down to the most fundamental equipment plugged into the internal power grid. Usually disruptive events manifest themselves in unexplained system lockups and result in service calls without the owner finding the cause.
Degrading power disturbances contain enough energy to microscopically erode an integrated circuit and its components. One description of degradation is weakening components, much like rust attacks metal. Degradation will lead to premature component failure if not taken seriously. It may be headed off by paying close attention to disruptive events.
Destructive power disturbances occur when an electronic device is overwhelmed by a large-amplitude, high-energy power event. Typically, lightning and thunderstorms are the culprits. Overvoltages can also result from power-line damage caused by storms or construction accidents. In one case, a freight carrier backed into lines attached to a building and the overvoltage passed to everything inside.
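The three levels above amount to a triage rule on any measured disturbance. The sketch below expresses that rule in code; the voltage and duration thresholds are illustrative placeholders only, not engineering limits, since real classification depends on your equipment's ratings:

```python
def classify_disturbance(overvoltage_pct, duration_ms):
    """Triage a power event into the three basic impact levels.

    overvoltage_pct: deviation above nominal line voltage, in percent.
    duration_ms: how long the event lasted, in milliseconds.
    Thresholds are illustrative, not equipment ratings.
    """
    if overvoltage_pct >= 100 or duration_ms >= 1000:
        return "destructive"   # e.g. lightning strike, downed lines
    if overvoltage_pct >= 20:
        return "degrading"     # erodes components over time, like rust
    return "disruptive"        # unexplained lockups and resets
```

The point of the exercise is that the most common events (disruptive) are also the easiest to overlook, since they leave no visible damage, only a mystery service call.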
I encourage all companies deploying microprocessor-based products to take the time to understand power issues, recognize the symptoms and know which technologies will protect and condition their equipment. Your uptime may depend on power conditioning, which addresses multiple issues at once. You may not know which power problem you need to address, or when, but using proactive solutions to prevent power problems will make keeping your kiosk on an island less of a frustration and more of a vacation.
For more information, including a collection of online articles, e-mail Dana Davis.
Dana L. Davis is the National Sales Manager at Smart Power Systems.
Posted by: Dana Davis at 02:24 pm
Digital Screenmedia Association | 13100 Eastpoint Park Blvd. Louisville, KY 40223 | Phone: 502-489-3915 | Fax: 502-241-2795