Chapter 1 | Foundations of Information Systems in Business
The Fundamental Roles of IS in Business
Support of Business Processes and Operations. As a consumer, you regularly encounter information systems that support the business processes and operations at the many retail stores where you shop. For example, most retail stores now use computer-based information systems to help their employees record customer purchases, keep track of inventory, pay employees, buy new merchandise, and evaluate sales trends. Store operations would grind to a halt without the support of such information systems.
Support of Business Decision Making. Information systems also help store managers and other business professionals make better decisions. For example, decisions about which lines of merchandise need to be added or discontinued and what kind of investments they require are typically made after an analysis provided by computer-based information systems. This function not only supports the decision making of store managers, buyers, and others, but also helps them look for ways to gain an advantage over other retailers in the competition for customers.
Support of Strategies for Competitive Advantage. Gaining a strategic advantage over competitors requires the innovative application of information technologies. For example, store management might decide to install touch-screen kiosks in all stores, with links to the e-commerce Web site for online shopping. This offering might attract new customers and build customer loyalty because of the ease of shopping and buying merchandise provided by such information systems. Thus, strategic information systems can help provide products and services that give a business a competitive advantage over its rivals.
Welch’s: Balancing Truckloads with Business Intelligence Given dramatic fluctuations in gas prices, it’s no surprise that companies want to find ways to rein in transportation costs. One company finding success in that endeavor is Welch’s, a well-known purveyor of food and packaged consumer goods. The company is tapping the power of business intelligence for better insight into its supply-chain operations, which in turn can help keep transportation expenses lower. Welch’s, the $654 million manufacturer known for its jams, jellies, and juices, recently installed an on-demand BI application from Oco. One way Welch’s is leveraging the Oco BI application is to ensure that truckloads delivered by its carriers go out full. The idea is that customers are already paying for the full truck when it delivers goods, even if it’s only halfway or three-quarters loaded. With the BI system, Welch’s can tell if a buyer’s shipment is coming up short of full capacity and help them figure out what else they can order to max it out, thus saving on future shipping costs. “Welch’s can go to the customer and say, ‘You’re only ordering this much. Why not round out the load with other things you need? It will be a lot cheaper for you,’” says Bill Copacino, president and CEO of Oco. “If you’re able to put 4,000 more pounds on the 36,000-pound shipment, you’re getting a 10 percent discount on transportation costs,” he adds. “We’re essentially capturing every element—from the customer orders we receive, to bills of lading on every shipment we make, as well as every data element on every freight bill we pay,” says Bill Coyne, director of purchasing and logistics for Welch’s. “We dump them all into one data warehouse [maintained by Oco], and we can mix-and-match and slice-and-dice any way we want.” Coyne says that Welch’s tries to ship its products out of its distribution center five days a week. “But we found ourselves just totally overwhelmed on Fridays,” he says. 
“We would complain, ‘How come there are so many orders on Friday?’ ” Now, the new system helps Welch’s balance its daily deliveries so that it uses about the same number of trucks, rather than hiring seven trucks on a Monday, five on a Tuesday, eight on a Wednesday, and so forth. The company reaps transportation savings by using a stable number of trucks daily—“as capacity is not jumping all over the place,” Copacino says. “We are gaining greater visibility into cost-savings opportunities, which is especially important in light of rising fuel and transportation costs,” says Coyne. Welch’s spends more than $50 million each year on transportation expenses, and the Oco BI application and reporting features have become critical in a very short period of time. “We literally can’t go any amount of time without knowing this stuff,” Coyne says.
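Copacino’s arithmetic is easy to verify: with a flat per-truck fee, the effective cost per pound falls as the load fills out. A minimal sketch, using a hypothetical $2,000 truckload fee (only the 36,000- and 40,000-pound figures come from the quote above):

```python
def cost_per_pound(truck_fee, pounds_shipped):
    """Effective shipping cost per pound for a flat per-truck fee."""
    return truck_fee / pounds_shipped

fee = 2_000.0  # hypothetical flat fee per truckload

partial = cost_per_pound(fee, 36_000)  # the load as originally ordered
full = cost_per_pound(fee, 40_000)     # after rounding out with 4,000 lb more

savings = 1 - full / partial           # fraction saved per pound shipped
print(f"per-pound savings: {savings:.0%}")  # prints "per-pound savings: 10%"
```

Whatever the actual fee, the ratio is the same: filling the last 4,000 pounds of a 40,000-pound truck cuts the per-pound cost by 10 percent, which is exactly the discount Copacino describes.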
Responsibility and Accountability for Project Success (and Failure) Your department—information technology—has just played a starring role in blowing a multimillion-dollar enterprise software project. The intense glare from the CEO, CFO, and other business leaders is squarely focused on the CIO, vice president of applications, project managers, and business analysts charged with making sure that this didn’t happen. Of course, IT is never 100 percent at fault for any massive project—whether an ERP or CRM implementation, mainframe migration, or networking upgrade. The business side usually plays its part. But the unfortunate and unfair fact is that because these initiatives are considered “technology projects,” the business will almost always look in IT’s direction when there’s blame to be tossed around. “That’s just a fact of life in
IT,” says Chris Curran, who’s both a consulting partner at Diamond Management & Technology Consultants and its chief technology officer. No sane executive would dismiss the strategic importance of IT today. And most don’t: An IT Governance Institute study, consisting of more than 250 interviews with executives of both large and small companies in a variety of industry sectors, found that half of the respondents said that IT is “very important to the enterprise,” and three-quarters stated that they align IT and business strategies. When it came to IT project accountability, “executive management” was identified as the group held accountable for IT governance in 71 percent of the enterprises. That’s all well and good, but when it comes to walking the walk with technology projects, non-IT executives appear to fall back on familiar rhetoric. In a similar 2009 survey of more than 500 IT professionals by ISACA, a nonprofit trade group focusing on corporate governance, almost half of respondents said “the CIO is responsible for ensuring that stakeholder returns on IT-related investments are optimized,” notes the survey report. Curran takes those results a step further. “Business investments need to have business accountability,” Curran says. “But when a project goes south, especially high-profile ERP implementations, IT gets blamed—but it’s not an IT project.” Curran’s advice for such massive undertakings, which CIOs and analysts talk up but many don’t follow, is practical: Think bite-sized project chunks and set proper expectations. He also advises his clients and their IT shops to embrace change and transparency—even if it hurts at first. “The corporate culture—the status quo—tends to be: ‘Everything’s good. We don’t talk about problems until they are near unrecoverable, because we know people don’t like bad news,’” Curran says. But there are always going to be problems. That, also, is “just a fact of life in IT.”
Modernize (Don’t Replace!) Your Legacy Applications Over time, all companies find themselves facing outdated legacy systems—those technologies that were developed long ago and that, while still working today, are lacking in one or more major aspects. The interface may look strange (monochromatic green letters on a black background!), business needs may have changed, and there may be new functions that the system simply cannot perform. Documentation is often missing or outdated, and the original developers are long gone. Current IT staff are not entirely sure how the old system works.
Where to go from here? Many companies would undertake a massive system development effort at this point. They would gather requirements for a new set of applications, which would be designed and either coded or acquired, then tested and implemented. And then would come training, data conversion, implementation rollouts, and the like. The initial release would likely not work as well as the “old” system it replaces, although things would get better as time (and money) goes into it. This is not, however, the only way. There are tools (once called “screen scrapers”) from vendors such as IBM, Attachmate, and Rocket Seagull Software that put a Web-browser front end between the old green screens of yore and the users of today. These tools capture the data directly from the legacy system and present it to users in an appealing graphical interface developed with current technologies and standards, while minimizing the amount of tinkering necessary in the background. In fact, the point of contact with the legacy system can be completely revamped without modifying the system itself in any significant way. Taking this route allows employees and users to access the system from any device that supports a browser, while giving the IT staff time to transition away from the legacy system gradually, replacing only those modules that need changing. This solution is not always the best one—sometimes it does make sense to replace the old system with a new one, just as replacing an old car can be more cost-effective than buying it a new transmission. Other times, however, companies are faced with systems that are ugly but otherwise work fine. The system might be mission-critical, so the replacement process must be approached with extreme care. Or the necessary resources (budget, time) may simply not be there. In those scenarios, extending the life of those systems with new and updated interfaces may be the way to go.
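In miniature, the screen-scraping approach these tools take works like this: the legacy application keeps painting its fixed-layout terminal screen, and a thin wrapper extracts fields by position and hands them to a modern front end. A hypothetical sketch (the screen layout and field positions here are invented, and real products handle far more than this):

```python
# A hypothetical "green screen" as a legacy system might paint it.
legacy_screen = (
    "CUST: 00412  NAME: ACME FOODS           \n"
    "BAL : 001523.75  TERMS: NET30           \n"
)

# Field positions (row, start column, end column), invented for illustration.
FIELD_MAP = {
    "customer_id": (0, 6, 11),
    "name":        (0, 19, 41),
    "balance":     (1, 6, 15),
    "terms":       (1, 24, 41),
}

def scrape(screen):
    """Pull named fields out of the fixed-position screen text."""
    rows = screen.splitlines()
    return {name: rows[r][a:b].strip() for name, (r, a, b) in FIELD_MAP.items()}

record = scrape(legacy_screen)
# record can now feed any modern front end (a JSON API, a Web page, ...)
# without touching the legacy system behind it.
```

The legacy application never changes; only the presentation layer does, which is why this route requires so little tinkering in the background.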
Hannaford Bros.: The Importance of Securing Customer Data Hannaford Bros. may have started as a fruit and vegetable stand in 1883, but it has expanded from its Maine roots to become an upscale grocer with more than 160 stores throughout Maine, Massachusetts, New Hampshire, upstate New York, and Vermont. In March 2008, the supermarket chain disclosed a data security breach; Hannaford said in a notice to customers posted on its Web site that unknown intruders had accessed its systems and stolen about 4.2 million credit and debit card numbers between December 7 and March 10. The breach affected all of Hannaford’s 165 supermarkets in New England and New York, as well as 106 stores operated under the Sweetbay name in Florida and 23 independently owned markets that sell Hannaford products. In a likely precursor of what was yet to come, two class-action lawsuits were filed against the company within the week. The filers argued that inadequate data security at Hannaford had resulted in the compromise of the personal financial data of consumers, thereby exposing them to the risk of fraud. They also claimed that the grocer appeared not to have disclosed the breach to the public quickly enough after discovering it. Even though the Hannaford breach is relatively small compared with some other corporate security problems, it is likely to result in renewed calls for stricter regulations to be imposed on companies that fail to protect consumer data. In addition to facing the likelihood of consumer lawsuits, retailers who suffer breaches have to deal with banks and credit unions, which are getting increasingly anxious about having to shell out tens of thousands of dollars to pay for the cost of notifying their customers and reissuing credit and debit cards. Retailers, on the other hand, have argued that the commissions they pay to card companies on each transaction are supposed to cover fraud-related costs, making any additional payments a double penalty.
They also have said that the only reason they store payment card data is the requirements imposed on them by the major credit card companies. While the ultimate impact of these and other security breaches may be hard to quantify, such breaches represent one of the most important challenges resulting from the ubiquitous use of electronic transaction processing and telecommunication networks in the modern networked enterprise, and one that is likely to keep growing every day. The security of customer and other sensitive data also represents one of the primary concerns of IT professionals.
The Critical Role of Business Analysts For two decades, the CIO has been viewed as the ultimate broker between the business and technology functions. But while that may be an accurate perception in the executive boardroom, down in the trenches, business analysts (BAs) have been the ones tasked with developing business cases for IT application development, in the process smoothing relations among competing parties and moving projects along. The 21st-century business analyst is a liaison, bridge, and diplomat who balances the oftentimes incongruous supply of IT resources and demands of the business. A recent Forrester Research report found that the most successful business analysts were the ones who could “communicate, facilitate and analyze.” The business analyst is a hot commodity right now due to business reliance on technology, according to Jim McAssey, a principal at The W Group, a consulting firm. “The global delivery capabilities of technology today make the challenges of successfully bridging the gap between business and IT even harder,” he says. “Companies typically don’t invest in an IT project without a solid business case,” says Jeff Miller, senior vice president of Aetea, an IT staffing and consulting firm. A good business analyst is able to create a solution to a particular business problem and act as a bridge to the technologists who can make it happen. “Without the BA role, CIOs are at significant risk that their projects will not solve the business problem for which they were intended,” says Miller. The ideal candidate will have 5 to 10 or more years of experience (preferably in a specific industry), a technical undergraduate degree, and an MBA. Strong risk-assessment, negotiation, and problem-resolution skills are key, and hands-on experience is critical. Business analysts must be process-driven and able to see a project through conflict and change, from start to finish.
“The BA also must have the ability to learn new processes,” says Miller. “A good BA learns business concepts and can quickly relate them to the specific needs of the project.” In the end, the more business technology analysts working in the business, the better off the CIO and IT function will be—whether those analysts report to IT or to the business side. That’s because those IT-savvy analysts, with their deeper understanding of and expertise in technologies, will “ultimately help the business make better decisions when it comes to its interactions with IT,” contend the Forrester analysts. And “CIOs have new allies in the business.” Salaries range from $45,000 (entry level) to $100,000 (senior business analyst) per year.
Upgrade Your Legacy Systems in Three Steps Replacing existing hardware and software with new technologies is both difficult and painful. Even when everybody involved (IT staff, users, other stakeholders) agrees that it is the best way to move forward, the process is still painful. Specifying and implementing a replacement system is fraught with difficulties and requires some tough decisions. Robert C. Seacord of Carnegie Mellon University’s Software Engineering Institute outlines a three-step process for moving forward:
Analyze the Current System in Place. The focus should be on what the existing application is doing, rather than on the technology on which it has been implemented, which users are sometimes quick to pinpoint as the culprit. Sometimes, however, the current system may no longer fulfill user needs. Or, the business is growing too rapidly and the technology does not scale well. Or, the system is no longer supported or understood. “We were facing a system running on a DOS- and Fox-based platform that no one knew much about, and the only guy who did understand it was about to retire,” says Gerry Doan of Santa Cruz County, California, who supports a 500-person department. Other times, systems become outdated due to changes in hardware, as companies facing the move to 64-bit operating systems are starting to discover (16-bit applications will no longer run on those). The key issue in this step is to understand why it is that the current system should be replaced, as those reasons will have a major influence on what the new system should do and look like.
Understand the Requirements for the New System. All system changes, even seemingly simple performance upgrades, require an understanding of what users actually do with them. One key indicator that an application is no longer satisfying the needs of the user community is the existence of “off-system” workarounds in the form of spreadsheets and databases that users have developed to cope with the shortcomings of the current technology. These are often a rich source of ideas for what the new system should do. Standardizing on a small set of key technologies may also simplify the life of everyone involved, both users and IT staff. “Our management’s intention is to mainstream as much technology as possible so everyone in IT will be using the same skill set,” says Santa Cruz County’s Doan. It’s a lot easier to maintain and upgrade systems when they are all the same.
Make the Business Case for the New Project. Cost-benefit analyses of upgrading legacy technologies are not always as straightforward as they seem. Some costs are easy to measure; for example, the cost of hardware and packaged software upgrades. Others, like the cost of the migration itself, are hard to estimate. And if understanding costs is tough, benefits may feel like mostly guesswork, since so many of them are really hard to quantify. Sometimes a single issue may be important enough to make the project worth undertaking. When Santa Cruz County upgraded, “we lost the requirement to support a handful of nonstandard machines, which included some Windows 95 equipment and a computer running an older Linux version,” says Doan. That was a big relief to the busy department. And then there are the unexpected benefits. After users get familiar with the new technology, they can start doing things nobody had originally envisioned. Santa Cruz “users had to be taught new ways to get to their data,” says Doan, “but the payoff was new things they could do that they couldn’t do before.”
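The cost-benefit arithmetic in this third step can be sketched as a simple net-present-value comparison of the two options: keep paying rising maintenance on the legacy system, or absorb a migration cost now for cheaper support later. All of the figures below are invented for illustration; they are not from the Santa Cruz County project:

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

rate = 0.08  # hypothetical discount rate

# Option A: keep the legacy system (rising maintenance, no migration cost).
keep = npv([-100_000, -120_000, -140_000, -160_000, -180_000], rate)

# Option B: migrate (big up-front cost, then cheaper support).
migrate = npv([-400_000, -40_000, -40_000, -40_000, -40_000], rate)

# The less negative NPV is the cheaper option over the five-year horizon.
better = "migrate" if migrate > keep else "keep"
```

With these particular assumptions the migration wins, but the point of the exercise is the one the passage makes: the answer turns on estimates (migration cost, future maintenance) that are genuinely hard to pin down, so the numbers should be treated as a structured argument, not a verdict.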
Chapter 2 | Competing with Information Technology
Boeing: Saving Big by Cutting Imaging Costs Hitting “Ctrl+P” can cost your business more than you think. It certainly did at aerospace giant Boeing. Imaging services—which includes production printing, office printing, faxing, scanning, and related supplies—used to cost the company nearly $150 million annually. The problem, says Earl Beauvais, Boeing’s director of print, plot, and scan services, was that imaging wasn’t centrally controlled, and the company used several vendors. Boeing also owned, operated, and maintained about 32,000 imaging devices. The lack of an enterprisewide solution meant, among other things, that each department was responsible for purchasing its own toner, paper, and other supplies. To increase efficiency and reduce cost, Beauvais and his team sought a managed services solution to handle everything from print cartridges to printer upkeep across Boeing’s 195 domestic sites and 168 international sites. Beauvais spent 18 months researching and interviewing vendors, who had to show how they would manage the company’s imaging technology needs while providing the greatest efficiency at the best price. He and his team chose a partnership comprising Dell (for maintenance and asset management) and Lexmark (for devices). They picked them in part because Dell had infrastructure in place at Boeing. To prove the concept, a six-month pilot implementation launched at Boeing’s St. Louis office in May 2007. The St. Louis system included 47 new Lexmark device categories, including printers, copy machines, and scanners. “We replaced the devices because we didn’t want variability of age,” says Beauvais. The beauty of managed services is that Dell owns the devices and handles maintenance, a key goal for Beauvais. Boeing saw ROI immediately because Dell’s service contract cost less than its existing agreements.
In the end, Boeing saved about 30 percent of its imaging maintenance and supplies costs, and 27 percent of its overall imaging costs annually at locations with the new system. The initiative began rolling out companywide at the end of 2007. For Boeing, the benefits couldn’t be clearer. Beauvais’s staff can now focus more on other business needs, and the company’s total imaging spending has been reduced to $110 million annually. Both will aid Boeing as it navigates a turbulent economy.
Competitive Advantage and Competitive Necessity The constant struggle to achieve a measurable competitive advantage in an industry or marketplace occupies a significant portion of an organization’s time and money. Creative and innovative marketing, research and development, and process reengineering, among many other activities, are used to gain that elusive and sometimes indescribable competitive advantage over rival firms. The term competitive advantage is often used when referring to a firm that is leading an industry in some identifiable way such as sales, revenues, or new products. In fact, the definition of the term suggests a single condition under which competitive advantage can exist: When a firm sustains profits that exceed the average for its industry, the firm is said to possess competitive advantage over its rivals. In other words, competitive advantage is all about profits. Of course, sales, revenues, cost management, and new products all contribute in some way to profits, but unless the contribution results in sustained profits above the average for the industry, no measurable competitive advantage has been achieved. The real problem with a competitive advantage, however, is that it normally doesn’t last very long and is generally not sustainable over the long term. Figure 2.6 illustrates this cycle. Once a firm figures out how to gain an advantage over its competitors (normally through some form of innovation), the competitors figure out how it was done through a process referred to as organizational learning. To combat the competitive advantage, they adopt the same, or some similar, innovation. Once this occurs, everyone in the industry is doing what everyone else is doing; what was once a competitive advantage is now a competitive necessity. Instead of creating an advantage, the strategy or action becomes necessary to compete and do business in the industry. 
When this happens, someone has to figure out a new way to gain a competitive edge, and the cycle starts all over again. Every organization is looking for a way to gain competitive advantage, and many have successfully used strategic information systems to help them achieve it. The important point to remember is that no matter how it is achieved, competitive advantage doesn’t last forever. Arie de Geus, head of strategic planning for Royal Dutch Shell, thinks there may be one way to sustain it: “The ability to learn faster than your competitors may be the only sustainable competitive advantage in the future.” This suggests an important role for information systems if any competitive advantage is to be achieved.
Universal Orlando: IT Decisions Driven by Customer Data Michelle McKenna is the CIO of Universal Orlando Resort, but she is also a mother of two and the planner of family vacations. In fact, she thinks of herself first as a theme park customer, second as a senior leader at Universal, and finally as the company’s CIO. “Recently we were brainstorming new events that would bring more Florida residents to our theme parks during off-peak tourist periods. Our in-house marketing group was pitching proposals, and I offered the idea of a Guitar Hero competition. Everyone loved it. But that idea didn’t come from being a CIO—it came from being a mother of two,” she says. “Thinking like our customers and focusing on our company’s markets are among the most important ways we can fulfill our responsibility to contribute to informed decision making,” says McKenna. Moving forward, it’s more critical than ever for CIOs to study market trends and find ways to maximize business opportunities. Universal Orlando is one of many brands in the travel and entertainment industry competing for discretionary dollars spent by consumers on leisure time and vacations. Of course, the competition boils down to a market of one—the individual consumer. People often assume that because of the high volume of guests, the experience at Universal Orlando has to be geared for the masses. But digital technology now enables guests to customize their experience. For example, the new Hollywood Rip Ride RockIt Roller Coaster will allow guests to customize their ride experience by choosing the music that plays around them while on the roller coaster. When the ride ends, guests will be able to edit video footage of that experience into a music video to keep, share with friends, or post online. Any CIO can take a few steps to get market savvy. Management gets weekly data about what happened in the park and what the spending trends are per guest. CIOs should get copied on any reports like that. 
They should study them and look for patterns. “Don’t be afraid to ask questions about it; give yourself permission to be a smart (and inquisitive) businessperson. When I first joined the company and asked about market issues, people looked at me and thought, ‘Why did she ask that? It doesn’t have anything to do with technology.’ Over time they realized that I needed to understand our data in order to do my job,” says McKenna. Knowledge of market data helps Universal Orlando drill down to understand what is really happening in business. For example, trends indicated that annual pass holders—Florida residents, primarily—spend less on food, merchandise, and other items than day-pass guests. It turned out that some pass holders do spend on par with day guests, particularly when they attend special events, Mardi Gras, and Halloween Horror Nights. “This analysis showed that we needed to segment those annual pass holders more deeply in order to better understand them and market to them. So we are building a new data warehouse and business intelligence tools that will calculate spending by hour and by pass type. The initiative started in IT, and we can find many similar opportunities if we look at market details and ask questions,” McKenna says.
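The slice McKenna describes, spending broken out by hour and by pass type, is at bottom a group-by aggregation. A toy sketch with invented transaction records (none of these figures come from Universal Orlando):

```python
from collections import defaultdict

# Invented point-of-sale records: (pass_type, hour_of_day, dollars_spent).
transactions = [
    ("annual", 11, 8.50), ("day", 11, 22.00), ("annual", 19, 35.00),
    ("day", 13, 18.25), ("annual", 13, 6.00), ("day", 19, 40.50),
]

# Aggregate spending by (pass type, hour): the warehouse query in miniature.
spend = defaultdict(float)
for pass_type, hour, dollars in transactions:
    spend[(pass_type, hour)] += dollars

# Slice one segment: what annual pass holders spend during evening hours,
# when special events such as Halloween Horror Nights run.
evening_annual = sum(
    total for (ptype, hour), total in spend.items()
    if ptype == "annual" and hour >= 18
)
```

A real BI platform runs the same kind of aggregation over millions of rows and many more dimensions, but the segmentation insight in the passage (annual pass holders who spend like day guests during special events) falls out of exactly this sort of query.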
To Build or to Buy—Is That Really the Question? To build or to buy? Software, that is. This is one of the most enduring, persistent questions in the world of IT. Should a company license (i.e., buy) a commercial application that will do 75 percent of what is needed, or should it develop its own applications that will support the requirements as closely as possible? The traditional answer has been that one buys to standardize—that is, to automate necessary but not strategic business processes—and builds to compete—that is, to support those core processes that make your company different from its competitors. But things may be more complicated than that. In some cases, homegrown systems may currently be handling menial, less-than-strategic tasks, but switching costs make it difficult to replace them with commercial software. In other cases, packaged software may do exactly what the company needs, even if those needs are strategic in nature. Then, why develop your own? Many IT executives will evaluate commercial software before even considering building their own. Buying commercial software as often as possible frees up resources for those times when you really, really need to build your own software. When making those decisions, it is important to understand the entire life cycle of software applications and not only the development stage. Many applications will last at least seven or eight years, and due to ongoing maintenance and improvements, 70 percent of the costs will be incurred after the software has been officially implemented. It thus seems that buying or building may be much more complicated than previously thought. Consider financial services giant Visa. Due to a major emphasis on security, reliability, and privacy concerns of its customers, Visa has an IT organization that is historically biased toward building in-house. 
This is also due to the sheer size of its global financial network—when you are that large, there is really nobody else you can turn to; all other provider organizations are small in comparison, and the benefits are just not there. However, even an IT organization that traditionally builds can turn to commercial software—even open-source software, of all things—when the economics make sense. Infrastructure and tools are one example. “They work, and there is no competitive advantage to build,” says David Allen, a consultant who served as Visa’s CTO for three years. “Those systems are built at a scale because you’re leveraging the technology across many companies.” Further, the company has embraced the availability of mature and reliable open-source tools, particularly in areas such as development, databases, and programming languages. “The combination of low-cost tools and having the source code available can be like getting the best of both worlds [of buying and building],” Allen says. “We have gotten as good if not better in deploying new services on open source as on commercially available software like Windows.” The best bet may be to put together all available data about business processes, assets and requirements, people, software, hardware, architecture, and compliance and present it with detailed alternatives—each with its advantages and disadvantages, benefits and consequences—to the business stakeholders, and let them make that decision. Inevitably, however, politics will rear its ugly head sooner or later.
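The lifecycle observation above, that roughly 70 percent of an application's cost arrives after implementation, suggests comparing total cost of ownership rather than up-front price when weighing build against buy. A hypothetical sketch with invented figures:

```python
def lifecycle_cost(upfront, yearly_maintenance, years):
    """Undiscounted total cost of ownership over the application's life."""
    return upfront + yearly_maintenance * years

# Invented figures for an eight-year application life.
build = lifecycle_cost(upfront=900_000, yearly_maintenance=260_000, years=8)
buy = lifecycle_cost(upfront=1_200_000, yearly_maintenance=150_000, years=8)

# Building looks cheaper up front, yet costs more over the full life;
# in this example, maintenance is roughly 70% of the "build" option's
# lifetime cost, in line with the figure cited in the passage.
cheaper_overall = "buy" if buy < build else "build"
```

Under these assumptions the pricier commercial license wins on lifetime cost; with different maintenance estimates the answer flips, which is why the passage insists on looking at the entire life cycle and not just the development stage.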
Sysco: Transforming a Company with the Help of IT Sysco, headquartered in Houston, Texas, is a major distributor of food products to restaurants, schools, hospitals, and hotels and is also a provider of equipment and supplies to the hospitality and food service industries. Sysco employs approximately
45,000 people, with sales for the 2010 fiscal year surpassing $37 billion. It operates more than 180 locations in the United States, Canada, and Ireland, from which it serves more than 400,000 customers. The company is organized into a series of large operating companies with geographical responsibilities, and a smaller group of specialty food companies that cater to particular segments of the market. Sysco has embarked on a new project to standardize and unify business processes across its operating companies and distribution centers. The overarching goals of the effort are to increase efficiency and improve sales and marketing, as well as provide increased transparency through improved data management. Not surprisingly, IT is an integral part of this transformation. “This is more than an IT project—it is truly a business transformation,” says Jim Hope, executive vice president. “Using the power of SAP as the foundation for our transformation, Sysco intends to improve productivity, retain and expand business with existing customers, and understand where market opportunities lie so we can do a better job attracting and pursuing new business.” The company chose SAP Business Suite and Business Objects business intelligence platforms as the centerpieces after a series of pilots and demonstrations convinced senior management this was the way to go. As everybody in IT knows, one of the most important keys to the success of this kind of large project is executive commitment. And so Sysco started by getting this first, and then figuring out the details later. “We’re starting to pilot some of the customer-facing applications, and in particular, an improved ordering platform for our customers,” said Mark Palmer, vice president of corporate communication.
“So far, we’re very happy with what we’re seeing.” These “details” include a four-pronged approach focused on getting more and better information into the hands of their sales associates (Sysco’s main point of contact with customers), a Web-based complete order management system that will also assist customers with personalized recommendations, a consolidated back office that will be shared by all affiliates, and standardized reporting across the company that will provide management with up-to-date information on all aspects of operations. And all this will be done with commercial software that will replace existing stand-alone systems that just could not deliver anymore. “We have a tremendous opportunity to use technology to continue to sharpen our operations,” says Twila Day, senior vice president and chief information officer. “SAP is the best technology provider to help us with our plans to integrate all of our software needs into a single platform, giving us the visibility required to efficiently manage our business end-to-end.”
Goodwin Procter Makes a Strong Case for Knowledge Management If anyone knows that time is money, it’s an attorney. The 850 attorneys and their staff at Goodwin Procter LLP were spending too much time assembling documents and looking up information, which meant cases took more time than they should to proceed. The $611 million law firm’s eight offices used seven different applications to manage more than 2 terabytes of data for Goodwin Procter’s more than 60,000 cases—close to 10 million documents. CIO Peter Lane wanted to integrate the data. Using Microsoft SharePoint, his team created the Matter Page System as a hub through which attorneys could access business data and client information. What’s more, the firm has been able to use the platform to share its notes and work in progress. It’s now easy for an attorney to find a colleague who can help with a similar case. Matter Pages took a year to implement, but it immediately changed how Goodwin Procter’s attorneys work. When a client called with a question, finding the answer used to mean launching more than one application and looking up the data in different systems. Attorneys needed contact information, documents, billing information, and more. The process sometimes took hours. “Now, instead of having to launch the different systems from the desktop, or the Web interface, or the document management system, we were able to pull all of this information into a one-stop-shop view for the users in our company,” says Andrew Kawa, Goodwin Procter’s development manager, who leads its system development efforts. The system increases efficiency for the attorneys because they can find previous matters that they or others have worked on and gain extra information much more quickly than before. They spend less time researching and more time moving a case forward. The initial success of Matter Pages has Lane investigating new SharePoint features, such as wikis and blogs. 
He expects to deploy these new capabilities widely over the next few months. For example, each matter has a wiki that is used to track notes, or other unstructured data that relates to it. These notes are open for editing by all users. Blogs tend to be used for discussions that are not case-specific, although when a matter or set of matters apply to the topic of the blog, users can add links to related cases. “One of the IT goals is to take advantage of the new technology as it becomes available,” Lane adds. With that goal in mind, says Lane, the Matter Pages System won’t ever truly be completed. Currently, Kawa is looking to integrate Goodwin Procter’s patent and trademark information with data about their patent applications from the U.S. Patent and Trademark Office. The integration would allow attorneys to retrieve real-time information on their pending patents and actions they need to take. “I don’t think we will ever declare the project done or say we don’t have to put any more time or effort in,” he says.
Chapter 3 | Computer Hardware
Corporate PC Criteria What do you look for in a new PC system? A big, bright screen? Zippy new processor? Capacious hard drive? Acres of RAM? Sorry, none of these is a top concern for corporate PC buyers. Numerous studies have shown that the price of a new computer is only a small part of the total cost of ownership (TCO). Support, maintenance, and other intangibles contribute far more heavily to the sum. Let’s take a look at four top criteria. Solid Performance at a Reasonable Price. Corporate buyers know that their users probably aren’t mapping the human genome or plotting trajectories to Saturn. They’re doing word processing, order entry, sales contact management, and other essential business tasks. They need a solid, competent machine at a reasonable price, not the latest whizbang. Many organizations are adopting a laptop, rather than desktop, strategy. Using this approach, the employee uses his or her laptop while in the office and out in the field. With the proliferation of wireless Internet access, this strategy allows employees to take the desktop with them wherever they may be—at their desk, in a conference room, at a meeting off-site, or in a hotel room in another country. One outcome of this strategy is the development and acquisition of more powerful laptops with larger and higher-quality screens. This demand presents a challenge to laptop manufacturers to provide higher quality while continuing to make the laptop lightweight and portable. Operating System Ready. A change in the operating system of a computer is the most disruptive upgrade an enterprise has to face. That’s why many corporate buyers want their machines to be able to handle current operating systems and anticipate new ones. Although most organizations have adopted Windows XP or Vista, some enterprises still use operating systems of an earlier vintage. 
Ultimately, they must be able to make the transition to Windows 7 (the newest OS from Microsoft) and even to OS versions expected three to five years from now. Primarily, that demand means deciding what hard disk space and RAM will be sufficient. Connectivity. Networked machines are a given in corporate life, and Internet-ready machines are becoming a given. Buyers need machines equipped with reliable wireless capabilities. With fewer cables to worry about, wireless networks, especially when combined with laptop PCs, contribute to the flexibility of the workplace and the simplicity of PC deployment. Many organizations are planning for Internet-based applications and need machines ready to make fast, reliable, and secure connections. Security-Equipped. Most of the data that is processed by networked workstations in a modern corporate environment can be considered proprietary, if not mission-critical. A major criterion for corporate purchase is the degree to which the device can accept or conform to the myriad of security measures in use in that organization. Can it accept a USB dongle, smartcard reader, biometric access device, and so forth? We will cover this aspect in greater detail in Chapter 13.
Supercomputers Aid Satellite Launches
Satellite launches are a noisy affair, especially for the satellite atop the rocket. Vibration and noise, unless compensated for, could render it useless before it reaches orbit, so researchers spend a lot of time on complex computer simulations that help them insulate the delicate craft. Now those simulations are about to get much more accurate, thanks to a new supercomputer that recently began work in Japan. The Fujitsu FX1 computer was inaugurated in 2009 by the Japan Aerospace Exploration Agency (JAXA). It has 3,008 nodes, each of which has a 4-core Sparc64 VII microprocessor. The machine has 94 terabytes of memory and a theoretical peak performance of 120 teraflops. Running standard benchmarks, it achieved a sustained performance of 110.6 teraflops, which ranks it as not only the most powerful machine in Japan but also the most efficient supercomputer in the world. Its benchmark performance represents 91.2 percent of its theoretical peak and outranks the previous record holder, a machine at the Leibniz Rechenzentrum in Munich. Ranked below the German computer is another JAXA machine. “Performance is about 15 times higher than the system we had before,” said Kozo Fujii, director of JAXA’s Engineering Digital Innovation Center. Two rows of computer racks make up the main system, and a third row alongside is a second, less powerful FX1 machine. In an adjoining room sits an NEC SX-9 vector computer for running specialized tasks and the storage that augments the entire system. All together, a petabyte of disk storage space and 10 petabytes of tape storage are connected to the system (a petabyte is a million gigabytes). And between the lot, there are many big, industrial air conditioners to keep the room cool and extract the heat generated by this mass of hardware. JAXA intends to put it to work on simulations such as the acoustic noise experienced by a satellite at launch, said Fujii. 
“There is a wide band of frequencies and usually the peak frequencies are located between 60 and 100 Hertz and we can capture at that level of frequencies. But hopefully with the new computer we can capture frequencies of 150 or 200 Hz that are difficult for the current computer.”
Gati Limited: Real-Time Delivery with Handheld Technology Gati Limited is one of the leading distribution and supply chain solutions companies in India. Launched in 1989 as a small cargo management company, Gati has grown to employ more than 3,500 people. Reaching more than 90 percent of India, Gati operates a fleet of more than 4,000 trucks of various capacities—containers, refrigerated, freight, and so forth—and does it all, from flexible point-to-point services to complex logistics and supply chain management. Rapid growth, however, is not without its disadvantages. As the company reached farther and farther into the country, it sought to keep the same level of customer service for which it had always been known: If Gati said it would be delivered, then it would be. And on time. At the center of the freight business lies the POD, or proof-of-delivery, document. This piece of paper, when signed by the recipient acknowledging the time and completeness of the delivery, provides evidence to all involved in the transaction that Gati fulfilled its part of the bargain. This document, however, needs to find its way back to the sender to be of any use. With hundreds of thousands of shipments somewhere in the country at any given time—the company covers 3.2 lakh, or 320,000, kilometers every day—the resources needed to ship those documents around are substantial, and the process is time-consuming. For instance, it took Gati about three days to send the PODs back to the original shippers. In this day and age, that just does not cut it anymore. “We needed to provide customers with real-time delivery results and eliminate the risk of losing physical copies of PODs,” says G.S. Ravi Kumar, CIO. Any solution would need to be relatively inexpensive and easy to implement across a workforce that was largely nontechnical. 
Gati opted for inexpensive handheld devices with GPRS (General Packet Radio Service, a robust 2G/3G packet mobile data service that supports Internet Protocols) and image-capturing functionality. As deliveries are made, the driver captures an image of the signed POD document and other package information and transmits it to a central database. Many benefits accrued to Gati out of this simple design. For example, Gati no longer needs an army of data entry operators who would type in details from the physical PODs to update the company’s data stores. Delivery information is now available almost in real time, and the costs of tracking and shipping physical PODs have largely disappeared. Although GPRS connectivity would sometimes be a challenge in rural areas, the company developed a way to work around this issue: Custom software captures the image and delivery information and then transmits them once it detects that connectivity has been reestablished. This way drivers can continue their routes without worrying about connectivity issues. “Today, we have eliminated the cost of couriering PODs, and updated delivery information is almost in real time,” says G. S. Ravi Kumar. The entire project, which cost Rs 90 lakh (about $200,000), was rolled out to more than 240 locations and more than 900 users, and it processes over 20,000 transactions a day. The simplicity of the solution was largely responsible for Gati Limited achieving ROI in less than six months.
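The connectivity workaround just described is a classic store-and-forward pattern. The sketch below is only an illustration of that pattern, not Gati's actual software; the class and function names are invented for the example:

```python
class PodUploader:
    """Minimal sketch of the store-and-forward pattern: PODs captured
    while the handheld is out of GPRS coverage are buffered locally
    and transmitted once connectivity is detected again."""

    def __init__(self, transmit):
        self.transmit = transmit   # callable that sends one record upstream
        self.pending = []          # local buffer on the device

    def capture(self, pod_image, delivery_info, online):
        # Always buffer first, so nothing is lost if a send fails midway.
        self.pending.append({"image": pod_image, "info": delivery_info})
        if online:
            self.flush()

    def flush(self):
        # Drain the backlog in the order the deliveries were made.
        while self.pending:
            self.transmit(self.pending.pop(0))
```

A driver making two deliveries in a dead zone and a third in coverage would see all three PODs transmitted, in delivery order, at the third capture.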
Forget the ATM: Deposit Checks without Leaving Home First, we didn’t need to visit the bank teller anymore. Then we were able to stick our checks right into the ATM without an envelope. Now we won’t have to leave the house to make deposits. Based in Sacramento, California, Schools Financial Credit Union is one of the latest banks to allow customers to scan checks at home and deposit them over the Internet. Golden One Credit Union, also from California, had introduced scanner-based check deposits in July 2009. “Banking’s not the way it was 5 or 10 years ago,” said Nathan Schmidt, a vice president at Schools Financial. “With any type of technology, it becomes more convenient to self-service.” Even with the widespread use of direct deposit and online banking, people still write and receive millions of paper checks each year. And for the most part, when we have to deposit a paper check, we still need to go to an ATM to do it. Businesses have been making deposits over the Internet far longer, ever since the passage in 2004 of the federal Check 21 Act, which made a digital image of a check legally acceptable for payment. Businesses quickly saw the benefits of the new law. Sending checks as digital images eliminated courier costs and paperwork. The extension of the service to consumers has come much more slowly. Cary Whaley, a director at Washington, D.C.–based Independent Community Bankers of America, says financial institutions have been wary about potential fraud. “For many banks, it remains a business application,” Whaley says. “The next step is the consumer side, but a lot of community banks are a little wary. 
When you’re getting into thousands of consumers, the challenge for banks and credit unions is not only monitoring risk, but monitoring for changes in transactions and transaction amounts.” But some bankers say consumers are increasingly demanding the same convenience given to their business counterparts, and it’s just a matter of time before remote deposits become much more widespread. When Schools Financial Credit Union decided to take the plunge, it included safeguards to prevent abuse. Customers must use their existing secure online banking log-in, and they can’t transmit items more than twice a day. Users have a time limit to scan and deposit the check online, and checks must meet specific requirements before they are deposited. Post-dated, damaged, or lightly printed checks, for instance, will not scan properly and cannot be deposited. “So many people prefer to do self-service. They choose to go online—maybe they’re parents with small kids, or they might not want to go to an ATM at 3 a.m.,” says Golden One’s chief executive officer, Teresa Halleck. “People are already online,” she says. “They’re comfortable with electronic delivery and they’re looking for more.”
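The safeguards Schools Financial put in place amount to a short screening checklist. The following sketch is illustrative only; the rule names, their ordering, and the two-per-day constant are taken loosely from the article, not from the credit union's code:

```python
DAILY_LIMIT = 2  # the article's "can't transmit items more than twice a day"

def screen_deposit(logged_in, deposits_today, scan_ok, post_dated):
    """Return (accepted, reason) for a remote check-deposit attempt.
    Each rule mirrors one safeguard described in the case."""
    if not logged_in:
        return (False, "secure online banking log-in required")
    if deposits_today >= DAILY_LIMIT:
        return (False, "daily remote-deposit limit reached")
    if post_dated or not scan_ok:
        return (False, "check does not meet scan requirements")
    return (True, "deposit accepted")
```

Damaged or lightly printed checks would surface here as a failed scan (`scan_ok=False`), since the article notes they simply will not scan properly.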
Work 7324: Collaboration Technology for Small Companies Technology is becoming more and more pervasive in all industries throughout the world. Medium-sized and large companies already rely heavily on technology in one way or another. On the consumer side of things, new products are introduced almost every day, from social media innovations to tablets to cloud-based music services. Many of these products are based on mobile platforms that come with cameras and GPS systems as defaults. But smaller companies have yet to fully embrace all of these developments. “There are a lot that focus on consumers, but nothing for smaller enterprises and the like,” says Jens Lundström, CEO of Two Story Software, the Swedish start-up behind Work 7324, a service that allows small businesses with employees out in the field to collaborate and keep everybody updated without the need to fill out paperwork. To make things easier for smaller businesses with more limited resources and technical expertise, the service is hosted remotely and can be accessed from any device with a connection to the Internet. “Usually handymen, including electricians, plumbers, and installers, have a mobile phone and a notepad, and that is their back office support. When they head out in the field they have work orders written on a note,” says Lundström. The service is targeted to organizations where a large portion of the workforce does not work at an office all day, but is rather out in the field and, most often, out of touch with headquarters for hours at a time. The goal is to take advantage of the already built-in features of mobile phones to give these smaller businesses a leg up against larger operations. Work 7324 allows managers to enter new work orders or update existing ones from the office and then notifies employees through an e-mail or text message, which contains a link that can be used to access the update and enter any necessary information. 
It can also be used to provide information on the customer or location the employee is about to visit. Any images or instructions that can be of help can also be included. After the visit to the customer is over, employees can use the same service to upload information back to the office, which can then be shared with other employees in the future. For example, salespeople can include photographs of new equipment installed at a customer site so that repair technicians know what to expect when they go out in the field to service it. They can also include useful nearby locations, or geo-tagged directions when customers are located in remote and hard-to-access rural areas. Work 7324 also includes support for quotes and invoicing on the spot, as well as reporting of any materials used and the number of hours worked. And the best part? It is all paperless. “Our goal is to make it simpler for small companies to work together and document what they have done, and do it faster with less paperwork,” says Lundström.
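The dispatch flow described above can be sketched in a few lines. This is a minimal illustration of the pattern, not Two Story Software's implementation; the class name, the notification callback, and the URL scheme are all invented for the example:

```python
class WorkOrderService:
    """Sketch of the Work 7324 flow: the office enters or updates an
    order, field staff get a message with a link back to it, and they
    push photos, materials, or hours back after the visit."""

    def __init__(self, notify):
        self.notify = notify   # e.g. a function that sends e-mail or SMS
        self.orders = {}

    def upsert_order(self, order_id, details, assignee):
        # setdefault keeps earlier field reports when an order is updated.
        order = self.orders.setdefault(order_id, {"attachments": []})
        order["details"] = details
        # Notify the field employee with a link back to the order.
        self.notify(assignee, f"https://example.invalid/orders/{order_id}")

    def report_from_field(self, order_id, item):
        # Photos, materials used, or hours worked, kept for later visits.
        self.orders[order_id]["attachments"].append(item)
```

A repair technician visiting the same customer later would find the salesperson's earlier photos still attached to the order, which is exactly the knowledge-sharing benefit the case describes.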
Kimberly-Clark: Secrets to RFID Success While you may not immediately recognize the name Kimberly-Clark, you have surely purchased one or more of its products: Scott, Huggies, Pull-Ups, Kleenex, and Kotex are some of its better-known brands. In fact, the company holds the number 1 or 2 spot in market share in more than 80 countries. At last count, more than 1.3 billion people use its products every day, which helps explain how sales for 2010 topped $19.7 billion. None of this would be possible without a razor-sharp global supply chain that coordinates manufacturing operations in 37 different countries, and distribution and sales in 150 countries overall—that is pretty much every country on the planet. Radio Frequency Identification (RFID) is the key to that effort. Kimberly-Clark was one of the first large companies to get heavily involved with RFID, and it has been a strong believer in the technology. “Our goal is to evolve the capabilities of our supply chain to a demand-driven network. One of the keys to achieving that vision is to have a highly integrated suite of supply chain systems that provide end-to-end visibility and as close to real-time information as possible,” says Mark Jamison, vice president of customer supply chain management at Kimberly-Clark. How “real-time” real-time data should be is an important issue. It is unlikely that most companies will need to know, on an hour-by-hour basis, what is going on with their products and customers; getting that information two or three times a day may be enough to understand the market. Yet being able to see how promotions are taking off, which products are moving more or less than expected, and how production volumes are coming along allows Kimberly-Clark to keep its products always on the shelf in the right quantity—so there are not too many and not too few. 
“Our strategy around RFID has been to focus on business processes and develop repeatable, scalable business processes that are enabled by the technology. The technology in and of itself is not going to bring value to the supply chain. The value to the supply chain comes from reengineering your business processes and enabling those new business processes to work with the technology,” notes Jamison. Consider, for example, product promotions and their delivery and execution. Kimberly-Clark managers found that their promotional materials and displays arrived in stores in time to meet promotion or advertising dates only 55 percent of the time. Only 55 percent of the time! The company redesigned the process to start tracking these materials using real-time data. They began to chart progress against plans, and they also involved their retail operation employees, who are the ones in daily contact with the stores and chains. Now, on a day-to-day basis, managers know which stores have not executed the promotions as necessary, and they can then send people to make those promotions happen immediately. Timely execution of promotions went up to 75 percent afterwards, also resulting in increased sales. And all of this was possible thanks to the timely data afforded by their RFID deployment. “In the supply chain, potentially, we could bring RFIDs back into the manufacturing environment, and trace raw materials. We’ve found that the bigger payback in the short term for us has been reducing out-of-stocks on the shelf. But we believe there are a lot more opportunities with RFID,” says Jamison.
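The metric being tracked here, on-time promotion execution, is a simple ratio over delivery events. A hedged sketch of how it might be computed from RFID read data (the pair format and day numbers are invented for illustration):

```python
def on_time_rate(deliveries):
    """deliveries: (arrival_day, promo_start_day) pairs, e.g. derived
    from RFID reads of display pallets at each store. Returns the share
    that arrived in time; the figure that went from 0.55 to 0.75 in
    the Kimberly-Clark case."""
    return sum(arrival <= promo for arrival, promo in deliveries) / len(deliveries)

# Hypothetical reads: two of these four displays arrived by the promo date.
reads = [(3, 5), (6, 5), (4, 4), (9, 7)]
on_time_rate(reads)  # → 0.5
```

Charting this rate store by store is what lets managers see which locations need a follow-up visit before the advertising date.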
Computers Will Enable People to Live Forever In just 15 years, we’ll begin to see the merger of human and computer intelligence that ultimately will enable people to live forever. At least that’s the prediction of author and futurist Ray Kurzweil. Kurzweil suggests that nanobots will roam our bloodstreams, fixing diseased or aging organs, while computers will back up our human memories and rejuvenate our bodies by keeping us young in appearance and health. The author of the book The Singularity Is Near, Kurzweil says that within a quarter of a century, nonbiological intelligence will match the range and subtlety of human intelligence. He predicts that it will then soar past human ability because of the continuing acceleration of information-based technologies, as well as the ability of machines to share their knowledge instantly. Kurzweil predicts people and computers will intermix with nanobots, blood cell-sized robots, that will be integrated into everything from our clothing to our bodies and brains. People just need to live long enough—another 15–30 years—to live forever. Think of it as replacing everyone’s “human body version 1.0” with nanotechnology that will repair or replace ailing or aging tissue, he says. Parts will become easily replaceable. “$1,000 worth of computation in the 2020s will be 1,000 times more powerful than the human brain,” says Kurzweil, adding that in 25 years we’ll have multiplied our computational power by a billion. “Fifteen years from now, it’ll be a very different world. We’ll have cured cancer and heart disease, or at least rendered them manageable chronic conditions that aren’t life threatening. We’ll get to the point where we can stop the aging process and stave off death.” Actually, we’ll hit a point where human intelligence just can’t keep up with, or even follow, the progress that computers will make, according to Kurzweil. 
He expects that nonbiological intelligence will have access to its own design plans and be able to improve itself rapidly. Computer, or nonbiological, intelligence created in the year 2045 will be one billion times more powerful than all human intelligence today. “Supercomputing is behind the progress in all of these areas,” says Kurzweil, adding that a prerequisite for nonbiological intelligence is to reverse-engineer biology and the human brain. That will give scientists a “toolkit of techniques” to apply when developing intelligent computers. In a written report, he said, “We won’t experience 100 years of technological advance in the 21st century; we will witness on the order of 20,000 years of progress, or about 1,000 times greater than what was achieved in the 20th century.” According to Kurzweil, here’s what we can expect in the not-so-distant future:
• Doctors will be doing a backup of our memories by the late 2030s.
• By the late 2020s, doctors will be sending intelligent bots, or nanobots, into our bloodstreams to keep us healthy, and into our brains to keep us young.
• In 15 years, human longevity will be greatly extended. By the 2020s, we’ll be adding a year of longevity or more for every year that passes.
• In the same time frame, we’ll routinely be in virtual reality environments. Instead of making a call on a cell phone, we will “meet” someone in a virtual world, take a walk on a virtual beach, and chat. Business meetings and conference calls will be held in calming or inspiring virtual locations.
• When you’re walking down the street and see someone you’ve met before, background information about that person will pop up on your glasses or in the periphery of your vision.
• Rather than requiring us to spend hours in front of a desktop machine, computers will be more ingrained in our environment. For instance, computer monitors could be replaced by projections onto our retinas or on a virtual screen hovering in the air.
• Scientists will be able to rejuvenate all of someone’s body tissues and organs by transforming his or her skin cells into youthful versions of other cell types.
• Need a little boost? Kurzweil says scientists will be able to regrow our own cells, tissues, and even whole organs, and then introduce them into our bodies, all without surgery. As part of what he calls the “emerging field of rejuvenation medicine,” new tissue and organs will be built out of cells that have been made younger.
• Got heart trouble? No problem, says Kurzweil. “We’ll be able to create new heart cells from your skin cells and introduce them into your system through the bloodstream. Over time, your heart cells get replaced with these new cells, and the result is a rejuvenated, young heart with your own DNA.”
• One trick we’ll have to master is staying ahead of the game. Kurzweil warns that terrorists could obviously use this same technology against us. For example, they could build and spread a bioengineered biological virus that’s highly powerful and stealthy. According to Kurzweil, we’re not that far away from solving a medical problem that has plagued scientists and doctors for quite some time now: the common cold. He notes that although nanotechnology could go into our bloodstreams and knock it out, before we even get to that stage, biotechnology should be able to cure the cold in just 10 years.
Chapter 4 | Computer Software
SAP Business Suite 7: Introducing Modular Scenarios Cutting across Organizational Functions Germany-based SAP AG is tackling business processes in a novel way with the newest version of its Business Suite, which embeds analytics acquired from Business Objects SA and introduces industry-specific “value scenarios.” Version 7.0 of SAP Business Suite, a library of business processes, adds industry best practices through more than 30 modular value scenarios—such as Superior Customer Value and Product Lifecycle Management (PLM)—designed to cross traditional organizational boundaries. These “predefined end-to-end business processes” are intended to be implemented in small steps by organizations as they need them, says Jim Hagemann Snabe, SAP executive board member. The value scenarios basically illustrate interrelationships between SAP product capabilities using graphical guides and business terms, not feature and function lists. The customer can also see the impact on the associated systems, and ultimately, the specific SAP modules that would need to be activated. Ray Wang, vice president at Cambridge, Massachusetts–based Forrester Research Inc., says customers will find the value scenarios “compelling as they align with the key business drivers users face.” But as with all best practices, Wang notes that “SAP will need to make it easy for customers to modify those scenarios, reduce the overall cost of owning SAP, and provide more frequent levels of innovation.” One customer, Colgate-Palmolive Co., has large implementations in CRM and PLM that would benefit from the new capabilities of version 7.0, says the company’s senior vice president of IT and business services, Ed Toben. “Particularly when you look at PLM, which is newer, the processes and the enhancement-pack concept of turning on pieces should make us move faster,” says Toben. 
Another customer, pharmaceutical company Roche, requires the flexibility and ability to scale as the business changes in order to remain current, says chief information officer Jennifer Allerton. “IT investments . . . have got to make sense in their own right,” she says. “And, the pharmaceuticals business is one where you invest for the long term and when you make investments about IT packages, you’re not going to change your mind the next day about them.” IBM Corp., also a customer, is focused on a number of transformation programs, including the area of operational efficiency, says Jeannette Horan, vice president of enterprise business transformation with the office of the chief information officer at IBM. To that end, the company’s strategy, says Horan, is to integrate the enterprise globally through common processes, using the Business Suite to “mix and match components of the business to go to market in new and interesting ways.” But while companies are taking a hard look at spending and reviewing projects, “that does not mean . . . that companies do not spend, they just spend very smartly and very wisely,” says Léo Apotheker, co-CEO of SAP AG. There is a need, says Apotheker, “to provide better and faster insight, a higher level of efficiency, a need to introduce a whole new degree of flexibility in the way we do business.”
McAfee Inc.: Security under a Software-as-a-Service Model Santa Clara, California–based security vendor McAfee Inc. released a Software as a Service (SaaS) Web security tool for protecting a distributed workforce from Web threats, while requiring fewer upfront costs from IT departments in light of current budgetary constraints. Especially in tough economic times, a SaaS model of software delivery, like the McAfee Web Protection Service, saves cash-strapped organizations money because IT staff members don’t have to spend valuable time managing on-site equipment, says Mark Campbell, senior product marketing manager with McAfee Inc. “They get the advantages of having a tool that is always on, always up-to-date, and with uptime guarantees,” says Campbell. One challenge with on-premise tools, he continues, is that when vendors issue a feature update, a period of time can elapse before the enhancements are up and running in the environment. That problem goes away when the software is hosted centrally. Features of the hosted security offering include reputation-based filtering, based on McAfee’s reputation system Trusted Source, to block constantly morphing threats. There is a flexible policy manager for setting policies for certain employee groups, such as access to certain social networking sites by contract employees versus executive management. Users have the ability to run reports and use dashboards to gain insight into an organization’s Web usage. “Are employees spending all day on Facebook, and does this align with our appropriate usage of Web tools?” says Campbell. Other features include malware protection, remote office and user support, and transparent user authentication. James Quin, senior research analyst with London, Ontario–based Info-Tech Research Group Ltd., can’t yet say if McAfee’s SaaS offering will be cheaper in the long run given monthly recurring costs for the service. 
That said, in this climate of eliminated capital budgets, Quin says “a solution like this offers them value there.” Small organizations in particular, says Quin, will benefit from not having to retain as much security expertise. Offering a SaaS option for malware technology that is “pretty commoditized” is certainly a move by McAfee to differentiate itself in a crowded space, says Quin. “And it puts them out front first because they’re not going to be the last ones to offer this kind of service,” he says. Campbell thinks customers’ perception of hosted security products has changed for the better, helped along by the successful adoption of hosted CRM tools like salesforce.com. “More and more IT departments are beginning to accept and really realize the benefits of it,” says Campbell.
Australian Maritime Safety Authority: Cloud Computing? Nothing New The Australian Maritime Safety Authority (AMSA) is a self-funded government authority tasked with the provision of safety and administrative services to the Australian maritime industry. These include maritime safety, environmental protection, maritime and aviation search and rescue, and a host of clerical services such as ship registration, audit, and inspection. As part of the latter, AMSA operates the International Safety Management (ISM) program, which keeps track of audits to international ships according to agreed-upon industry standards and best practices. The IT behind this process records these audits and issues certificates of compliance that remain valid for a period of time.
Information is stored centrally in Canberra and accessed and collected from 14 different ports. While certainly important, this is a low-volume system: AMSA makes about two dozen of these inspections a month. The application then in use was based on Microsoft Excel, but over time AMSA became interested in creating something with a Web presence, which would be easy to access from all locations and eventually by the public at large. Since most of its system development was done in-house at the time, AMSA started by looking there first, but did not see anything it liked. The traditional approach to system development would take about six months and cost between 200,000 and 300,000 Australian dollars (between about $210,000 and $315,000). AMSA therefore started looking for alternatives. At a conference in 2008 it learned about salesforce.com, one of the leading providers of cloud-based Software as a Service (SaaS). At the time, developing the ISM application using the Force.com cloud computing platform would not only be a first in Australian government, it would be the first-ever Force.com development in the country. Better still, both the cost and the time required were an order of magnitude smaller than what AMSA had estimated for internal development. Salesforce.com consultants estimated the work would take about six weeks of development time and cost 30,000 Australian dollars; this included development, training, and one year of licensing, with ongoing licensing and maintenance fees of about 8,000 Australian dollars a year thereafter. At these rates, the low estimate for the internal development alternative would cover development plus roughly 21 years of licensing under the cloud-based alternative. While hard to pin down exactly, AMSA estimated ongoing maintenance costs for an internally developed application at about 20,000 Australian dollars a year. 
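The build-versus-cloud comparison can be checked with a quick back-of-the-envelope calculation using the figures from the case (all amounts in Australian dollars):

```python
# Break-even sketch for AMSA's build-vs-cloud decision (figures from the case, in AUD).
internal_dev_low = 200_000        # low estimate for in-house development
internal_maint_per_year = 20_000  # estimated yearly maintenance if built in-house

cloud_initial = 30_000            # Force.com development + training + first year of licensing
cloud_per_year = 8_000            # ongoing licensing and maintenance

# Years of cloud licensing that the in-house development budget alone would cover:
years_covered = (internal_dev_low - cloud_initial) / cloud_per_year
print(round(years_covered, 1))    # roughly 21 years, matching the case's estimate
```

The gap widens further once the 20,000 dollars a year of in-house maintenance is factored in, since the cloud alternative's 8,000-dollar fee already includes maintenance.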
While not everything went perfectly—an important requirement was completely overlooked in the alpha version, which required extensive rework later on—business users of the new ISM application were quite satisfied with the many improvements over the existing Excel-based technology. One thing to keep in mind, however, is what would happen if AMSA decided to terminate the relationship. Since the application runs only on the Force.com platform, there is little that could be recovered other than the data contained in it, which would be returned to AMSA within 24 hours of termination. AMSA management underscores that it is important to go into this type of arrangement knowing how one would get out, if ever needed.
Toronto’s Hospital for Sick Children: Challenges in Making Virtualization Work Toronto’s Hospital for Sick Children has learned the hard way that virtualization efforts won’t be successful if vendors aren’t ready to support you, according to its director of technology, Ana Andreasian. The hospital (usually referred to as “Sick Kids”) has already consolidated a considerable amount of its server infrastructure, which now includes 300 physical and 60 virtual machines. Sick Kids employs about 110 IT staff members who serve more than 5,000 employees. Andreasian said the biggest issue she’s experienced so far has come from vendors who do not properly test their applications before offering them to virtualization customers. “They’ll say, ‘Give me one CPU, one gig of memory, and I’m good,’” she says. “Then you’ll find they need four CPUs and four gigs of RAM. You wind up having a never-ending discussion on how to solve the performance problems.”
Another challenge has been vendors who say they’re willing to support virtual environments, but not fully. “Some vendors have a condition: if you have a problem, you have to move (the application) out of a virtual environment,” she says. “That’s just not practical.” Sick Kids Hospital is somewhat unusual in that it started its virtualization journey by focusing on storage systems rather than servers. Andreasian explained that the organization currently manages some 150 terabytes of data, which is always on the increase. Devices to handle that data, meanwhile, always end up going out of support. “We were facing the question: How do you migrate that data? It’s a huge cost,” she says, adding that no one wants to experience any downtime associated with such a migration. And all this has to happen in such a way that’s transparent to the user. The hospital has also turned to Citrix for application virtualization in order to allow remote support, which is important in a hospital situation where many clinicians may need to work from home. Sick Kids is now using VMware to deal with the more common issues around managing server fleets, such as lack of real estate, power costs, and the need to provision (that is, set up) machines more quickly. “In the physical world, if you have good planning and processes in place, that will help you with virtualization,” says Dennis Corning, HP’s worldwide senior manager of product marketing for virtualization. Andreasian agrees. “Provisioning (a virtual server) is easy. De-provisioning once the business user no longer needs it is where it’s difficult,” she says. “They might not tell you it’s no longer necessary. You need governance and monitoring and process.”
Modern (and Automatic?) Code Generation Twenty years ago, software engineer Fred Brooks famously observed that there was no silver bullet that could slay “the monster of missed schedules, blown budgets and flawed products.” Today, the creation of software might seem as expensive, trouble-prone, and difficult as ever—and yet progress is being made. Although no silver bullet is in sight, an array of new techniques promises to further boost a programmer’s productivity, at least in some application domains. The techniques span a broad spectrum of methods and results, but all are aimed at generating software automatically. Typically, they generate code from high-level, machine-readable designs or from domain-specific languages—assisted by advanced compilers—that sometimes can be used by nonprogrammers. Gordon Novak, a computer science professor at the University of Texas at Austin and a member of the school’s Laboratory for Artificial Intelligence, is working on “automatic programming”—using libraries of generic versions of programs, such as algorithms to sort or find items in a list. Unlike traditional subroutines, which have simple but rigid interfaces and are invoked by other lines of program code, his technique works at a higher level and is therefore more flexible and easier to use. Novak’s users construct “views” that describe application data and principles and then connect the views by arrows in diagrams that show the relationships among the data. The diagrams are, in essence, very high-level flowcharts of the desired program. They get compiled in a way that customizes the stored generic algorithms for the user’s specific problem, and the result is ordinary source code such as C, C++, or Java. Novak says he was able to generate 250 lines of source code for an indexing program in 90 seconds with his system. That’s equivalent to a week of productivity for an average programmer using a traditional language. 
“You are describing your program at a higher level,” he says. “And what my program is saying is, ‘I can tailor the algorithm for your application for free.’ ” Douglas Smith, principal scientist at Kestrel Institute, a nonprofit computer science research firm in Palo Alto, California, is developing tools to “automate knowledge and get it into the computer.” A programmer starts with Kestrel’s Specware, which is a general-purpose, fifth-generation language that specifies a program’s functions without regard to the ultimate programming language, system architecture, algorithms, data structures, and so on. Specware draws on a library of components, but the components aren’t code. They are at a higher level and include design knowledge and principles about algorithms, data structures, and so on. Smith calls them “abstract templates.” In addition, Specware can produce proofs that the working code is “correct”— that is, that it conforms to the requirements put in by the user (which, of course, may contain errors). “Some customers want that for very-high-assurance applications, with no security flaws,” Smith says. Kestrel does work for NASA and U.S. military and security agencies. “It’s a language for writing down problem requirements, a high-level statement of what a solution should be, without saying how to solve the problem,” Smith says. “We think it’s the ultimate frontier in software engineering. It’s what systems analysts do.”
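Neither Novak's system nor Kestrel's Specware is shown in the case, but the core idea both share, compiling a high-level declarative description into ordinary source code, can be illustrated with a toy generator. Every name and field below is invented for illustration:

```python
# Toy illustration of generative programming: a declarative "spec" is compiled
# into ordinary Python source code. (Invented example; not Novak's or Kestrel's tool.)

SPEC = {
    "name": "find_overdue",
    "input": "records",             # a list of dicts
    "filter_field": "days_overdue",
    "filter_op": ">",
    "filter_value": 30,
    "sort_field": "days_overdue",
}

TEMPLATE = '''def {name}({input}):
    matches = [r for r in {input} if r["{filter_field}"] {filter_op} {filter_value}]
    return sorted(matches, key=lambda r: r["{sort_field}"], reverse=True)
'''

source = TEMPLATE.format(**SPEC)   # "compile" the spec into source code
namespace = {}
exec(source, namespace)            # load the generated function

data = [{"id": 1, "days_overdue": 45}, {"id": 2, "days_overdue": 10}]
print(namespace["find_overdue"](data))   # only record 1 qualifies
```

The spec says *what* is wanted (filter and sort criteria); the generator decides *how*, which is the division of labor the case attributes to both Novak's views and Kestrel's abstract templates.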
Aptara Inc.: Revolutionizing the Publishing Industry through XML The publishing industry has experienced an upheaval in the past decade or so. The “long tail” of sales of existing books via Web sellers such as Amazon and the improvement in software and hardware technologies that can replicate the experience of reading a book or magazine means publishing houses are printing and selling fewer new books. As a result, many of these companies are venturing into digital publishing. “All the publishers are shifting from print to digital,” said Dev Ganesan, president and CEO of Aptara, which specializes in content transformation. “That’s a huge change. What that means for software companies is that they need to develop platforms for content creation that meet the needs of every customer. At the same time, customers are looking at publishing in terms of how content is handled by authors, editors, and production employees. On top of that, they’re trying to automate parts of the production process. And companies must be willing to market products using traditional and new media to reach the widest possible audience. So there are a lot of challenges, but a lot of opportunities, too.” The upshot of all this is that learning professionals can now deliver content more flexibly and at a lower cost. They can make static content dynamic by taking a body of knowledge in print—such as a book—and converting it to a digital format. They can then chunk that content into smaller sizes and organize those nuggets of information according to learners’ needs. Moreover, they can get content published and distributed much more quickly via digital, online media. This is critical in an industry such as health care, which faces rapid changes due to technological innovation and regulation, said another Aptara source. “In addition to the cost savings, they want to turn it around much faster,” he said. “Time to market is becoming paramount because there’s so much innovation going on. 
If they don’t have their print products out faster, they fall behind.” A breakthrough product from Aptara is called PowerXEditor (PXE). An XML-based application, PXE allows a publisher to upload an existing book layout; edit or revise all elements of the book, including text look and feel, figures, tables, and other elements unique to that book; and output the book to a paging program that sets the book up for final printing. The important issue is that all of this is done in a digital format instead of the previously common method of tear pages and cut-and-paste of figures and tables. Because the PXE content is XML-based, the application can be accessed via the Internet using any conventional Web browser. This means all of the contributors to a textbook can have access to the various chapters and elements no matter where they are.
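PXE's actual schema is not shown in the case, but the benefit of XML-based content, namely that any contributor or tool can address individual elements of a chapter through a standard parser, can be sketched with a hypothetical markup and Python's standard library:

```python
# Hypothetical XML markup for a fragment of book content (the real PXE schema
# is not shown in the case); Python's stdlib can then address each element directly.
import xml.etree.ElementTree as ET

chapter_xml = """
<chapter number="4" title="Computer Software">
  <section title="Aptara Inc.">
    <para>All the publishers are shifting from print to digital.</para>
    <figure id="fig4-21" caption="A typical PXE screen"/>
  </section>
</chapter>
"""

root = ET.fromstring(chapter_xml)

# Because the structure is explicit, editors, proofers, and paging programs can
# each pull exactly the elements they need, from any Web browser or tool:
for fig in root.iter("figure"):
    print(fig.get("id"), "-", fig.get("caption"))
```

This element-level addressability is what replaces tear pages and cut-and-paste: a figure or table is an identified node that any contributor can revise in place.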
Add in the workflow management aspects of PXE, and all phases of the textbook revising, copyediting, and proofing processes can be handled with ease. Figure 4.21 shows a typical PXE screen. You might notice that it is in the process of editing the page you are currently reading. Figure 4.22 shows the XML code for the same page.
Airbus: Flying on SAP and Web Services European aircraft builder Airbus has implemented a Web services–based travel management application from SAP as a first step in a planned groupwide migration to a service-oriented architecture (SOA). The airplane manufacturer is installing the travel management component of SAP’s ERP software, mySAP, which uses SOA technology. “The new system replaces a homegrown system at the company’s plant in France, a Lotus-based system in its Spanish operations, and earlier SAP versions at facilities in Germany and the United Kingdom,” says James Westgarth, manager of travel technology procurement at Airbus. “We like the idea of an open architecture, which SOA enables,” Westgarth says. “We like the idea of being able to manage everything internally and to cherry-pick for the best solution in every class. Additional components, such as online booking, could also come from SAP—if the software vendor has a superior product for that application,” says Westgarth. The decision to deploy a new Web services–based travel management system was driven in large part by a need to reduce administration costs and improve business processes. Airbus has a travel budget of 250 million euros, which is used to help pay for more than 180,000 trips annually. The company aims to reduce costs by eliminating the current paper-based reimbursement process, which consumes time and labor, with a system that enables employees to process their own travel expenses online from their desktops or mobile devices. A key benefit for employees: Reimbursement time will be reduced to 3 days from about 10 days. In addition, the new system allows Airbus to integrate new service providers more easily into its operations, notes Westgarth. The manufacturer has outsourced its value-added tax reclaim activities to a third party specializing in this service. 
With the help of application link enablers, Westgarth and his team are able to link their travel management system into the company’s other SAP applications, including finance and human resources. Airbus has a strategy to eventually migrate to the mySAP ERP across multiple systems and countries over a number of years. “The company chose travel management to pilot mySAP ERP,” says Westgarth. There have been some issues with the rollout of the travel management application, Westgarth concedes. “Because we’re the first big company to implement this technology, we’ve had difficulty finding enough skilled people on the market,” he said. “And some work was required to integrate the Web interface into our portal.” But Airbus employees, Westgarth said, like the Web-based application’s new user interface, the single sign-on and the step-by-step guidance. And the company likes the flexibility. “No one was talking about low-cost carriers five years ago,” he said. “We need to adapt to the market and to changing needs.”
Chapter 5 | Data Resource Management
Database Pioneer Rethinks the Best Way to Organize Data Is there a better way to build a data warehouse? For years, relational databases, which organize data in tables composed of vertical columns and horizontal rows, have served as the foundation of data warehouses. Now database pioneer Michael Stonebraker is promoting a different way to organize them, promising much faster response times. As a scientist at the University of California at Berkeley in the 1970s, Stonebraker was one of the original architects of the Ingres relational database, which spawned several commercial variants. A row-based system like Ingres is great for executing transactions, but a column-oriented system is a more natural fit for data warehouses, Stonebraker now says. SQL Server, Sybase, and Teradata all have rows as their central design point. Yet in data warehousing, faster performance may be gained through a column layout. Stonebraker says all types of queries on “most data warehouses” will run up to 50 times faster in a column database. The bigger the data warehouse, the greater the performance gain. Why? Data warehouses frequently store transactional data, and each transaction has many parts. Columns cut across transactions and store an element of information that is standard to each transaction, such as customer name, address, or purchase amount. A row, by comparison, may hold 20–200 different elements of a transaction. A standard relational database would retrieve all the rows that reflect, say, sales for a month, load the data into system memory, and then find all sales records and generate an average from them. The ability to focus on just the “sales” column leads to improved query performance. There is a second performance benefit in the column approach. Because columns contain similar information from each transaction, it’s possible to derive a compression scheme for the data type and then apply it throughout the column. 
Rows cannot be compressed as easily because the nature of the data (e.g., name, zip code, and account balance) varies from record to record. Each row would require a different compression scheme. Compressing data in columns makes for faster storage and retrieval and reduces the amount of disk required. “In every data warehouse I see, compression is a good thing,” Stonebraker says. “I expect the data warehouse market to become completely column-store based.”
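The row-versus-column trade-off Stonebraker describes can be sketched in a few lines of code. This is a toy illustration of the storage layouts, not any vendor's engine:

```python
# Toy illustration of row vs. column layout (not any vendor's storage engine).
rows = [
    {"customer": "Acme",   "zip": "63101", "sale": 120.0},
    {"customer": "Acme",   "zip": "63101", "sale": 80.0},
    {"customer": "Breeze", "zip": "90210", "sale": 200.0},
]

# Row store: averaging sales means touching every field of every row.
avg_row = sum(r["sale"] for r in rows) / len(rows)

# Column store: the same data held as one list per column; a query over
# "sale" reads only that column, and similar values sit next to each other.
columns = {
    "customer": ["Acme", "Acme", "Breeze"],
    "zip":      ["63101", "63101", "90210"],
    "sale":     [120.0, 80.0, 200.0],
}
avg_col = sum(columns["sale"]) / len(columns["sale"])

# One compression scheme per column works because values repeat; here,
# simple run-length encoding of the zip column:
def rle(values):
    out = []
    for v in values:
        if out and out[-1][0] == v:
            out[-1][1] += 1
        else:
            out.append([v, 1])
    return out

print(avg_row == avg_col)     # same answer either way
print(rle(columns["zip"]))    # [['63101', 2], ['90210', 1]]
```

The answers are identical; what differs is how much data must be read and how well each layout compresses, which is exactly where the claimed speedups come from.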
AAA Missouri: Data Quality Is an Important First Step Although it may sound deceptively simple, keeping data that are both correct and clean is a major challenge for businesses of all makes and sizes. AAA Missouri, which serves Arkansas, Louisiana, Mississippi, and parts of other states (Kansas, Illinois, and Indiana) and processes about 600,000 records a year, came head-to-head against this challenge in the form of customer contact information—in particular, customer address validation. In recent years, the AAA organization has greatly expanded its offerings of products and services, going from the traditional maps and roadside assistance— although those are still there—into financial services and insurance, especially homeowners and automobile insurance. Although you might think that mailing problems brought about the need for accurate customer address information, for AAA Missouri it was necessary as a result of its expansion into the provision of homeowners insurance. AAA Missouri uses specialized software that retrieves critical property information based on the address provided; the problem is that those addresses must be formatted to U.S. Postal Service standards for the tool to work. And hence the need to work with customer addresses that are perfectly accurate and valid. After researching a number of offerings, a committee chose to adopt Melissa Data’s DQWS (Data Quality Web Service). The solution is hosted off-site by the vendor and is available 24/7 to AAA for real-time address verification. The remote hosting aspect was a major selling point, with its lack of up-front hardware, infrastructure costs, and the long-term maintenance expenses typically associated with systems developed in-house. It was “probably the top reason we went with Melissa,” says Dan Perry, a project leader within the AAA Missouri IT Department. “All of the other products required us to host the solution,” he explains. The process itself is quite straightforward. 
“When a user enters an address and clicks to continue,” Perry says, “we call Melissa Data to scrub the address. If there are no errors returned, we save the address to the database.” Otherwise, the customer receives an error prompt to recheck the address or, if it belongs to a new subdivision that may still not be in the database, to follow a special process to add it. All of this happens in real time with no evident performance effects due to the transmission of the address to the Web service and back to AAA’s Web site. Indeed, the entire process is completely transparent to users. Ultimately, the goal is to have every single address in the AAA database verified. In the meantime, the Web service is used in a variety of processes, including insurance underwriting, customer care, help desk, and management and reporting. DQWS is also being implemented in new services that AAA Missouri offers. For instance, as the organization added rental dwelling insurance to its offerings, the Web service ran addresses of the prospective customers. That will also be the case when it offers excess liability insurance in the future. “We decided from the beginning that we would use DQWS to validate and format all addresses entered into the system,” Perry explains.
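Melissa Data's actual API is not documented in the case, so the service call below is a hypothetical stand-in; what the sketch shows is the validate-before-save flow Perry describes, with the error path prompting the customer to recheck:

```python
# Hypothetical sketch of the validate-before-save flow described in the case.
# `scrub_address` stands in for the real DQWS web-service call, whose API is
# not shown here.

def scrub_address(address):
    """Stand-in for the remote address-scrubbing web service."""
    # Pretend the service knows one valid, USPS-formatted address:
    known = {"1 Main St, St. Louis, MO 63101"}
    if address in known:
        return {"ok": True, "formatted": address.upper()}
    return {"ok": False, "error": "address not found"}

def save_address(db, address):
    result = scrub_address(address)
    if result["ok"]:
        db.append(result["formatted"])   # save only verified, formatted addresses
        return "saved"
    # Otherwise, prompt the customer to recheck the address (or follow the
    # special process for new subdivisions not yet in the database).
    return "recheck: " + result["error"]

db = []
print(save_address(db, "1 Main St, St. Louis, MO 63101"))   # saved
print(save_address(db, "99 Nowhere Rd"))                    # recheck prompt
```

Keeping the check synchronous with the user's click is what makes the process transparent: either a clean, standardized address reaches the database, or the user is asked to correct it on the spot.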
Hadoop: Ready for the Large-Scale Data Sets of the Future Traditional business intelligence solutions can’t scale to the degree necessary in today’s data environment. One solution getting a lot of attention recently: Hadoop, an open-source product inspired by Google’s search architecture. Twenty years ago, most of the data from companies came from fundamental transaction systems: Payroll, ERP, and so on. The amounts of data seemed large, but they usually were bounded by well-understood limitations: the overall growth of the company and the growth of the general economy. For those companies that wanted to gain more insight, the related data warehousing systems reflected the structure of the underlying systems: regular data schema, smooth growth, and well-understood analysis needs. The typical business intelligence constraint was the amount of processing power that could be applied. Consequently, a great deal of effort went into the data design to restrict the amount of processing required to the available processing power. This led to the now time-honored business intelligence data warehouses: fact tables, dimension tables, and star schemas. Today, the nature of business intelligence is totally changed. Computing is far more widespread throughout the enterprise, so many more systems are generating data. Companies are on the Internet, generating huge torrents of unstructured data: searches, click-streams, interactions, and the like. And it’s much harder—if not impossible—to forecast what kinds of analytics a company might want to pursue. Today it might be click-stream patterns through the company Web site. Tomorrow it might be cross-correlating external blog postings with order patterns. The day after it might be something completely different. And the system bottleneck has shifted. In the past, the problem was how much processing power was available, but today the problem is how much data need to be analyzed. 
At Internet-scale, a company might be dealing with dozens or hundreds of terabytes. At that size, the number of drives required to hold the data guarantees frequent drive failures, but attempting to centralize the data imposes too much network traffic to conveniently migrate data to processors. Hadoop is an open-source product inspired by Google’s search architecture. Interestingly, unlike previous open-source products that were usually implementations of previously existing proprietary products, Hadoop has no proprietary predecessor. The innovation in this aspect of big data resides in the open-source community, not in a private company. Hadoop creates a pool of computers, each with a special Hadoop file system. A central master Hadoop node spreads data across each machine in a file structure designed for large block data reads and writes. It uses a clever hash algorithm to cluster data elements that are similar, making processing data sets extremely efficient. For robustness, three copies of all data are kept to ensure that hardware failures do not halt processing. The advantage of this approach is that very large sets of data can be managed and processed in parallel across the machine pool managed by Hadoop. The power of Hadoop is clear from the way The New York Times used it to convert a 4-terabyte collection of its pages from one format to another.
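Hadoop's programming model (MapReduce) is not spelled out in the case, but its essence, mapping a function over distributed blocks of data and then reducing the partial results, can be simulated on a single machine:

```python
# Single-machine simulation of the MapReduce pattern that Hadoop parallelizes
# across a pool of computers (a sketch of the pattern, not Hadoop's actual API).
from collections import defaultdict
from itertools import chain

log_blocks = [  # pretend each block lives on a different Hadoop node
    ["/home", "/products", "/home"],
    ["/products", "/checkout"],
]

def map_phase(block):
    # Emit (key, 1) pairs; on Hadoop, each node runs this on its local block.
    return [(page, 1) for page in block]

def reduce_phase(pairs):
    # Sum counts per key; Hadoop shuffles pairs to reducers by key first.
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

pairs = chain.from_iterable(map_phase(b) for b in log_blocks)
print(reduce_phase(pairs))   # {'/home': 2, '/products': 2, '/checkout': 1}
```

Because each map call touches only its own block, the work scales by adding machines rather than moving the data, which is how jobs like the New York Times conversion chew through terabytes.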
Coty: Using Real-Time Analytics to Track Demand In the perfume business, new products like the recent launch of Kate, a fragrance
Coty branded for supermodel Kate Moss, can make or break a company’s year. But big hits can also lead to big problems. When a product takes off, Coty must respond quickly to keep shelves full, but its ability to ramp up is dependent on glass, packaging, and other suppliers. “If we can’t meet demand . . . it annoys the retailers, the consumers lose interest, and we lose sales,” says Dave Berry, chief information officer at Coty, whose other brands include Jennifer Lopez, Kenneth Cole, and Vera Wang. Empty shelves are the scourge of manufacturing and retail. Just look at the annual shortages of the Christmas season’s hottest toys or at the rain checks stores must write regularly on sale items. At any given time, 7 percent of all U.S. retail products are out of stock; goods on promotion are out of stock more than 15 percent of the time. That’s why manufacturers and retailers are pushing for the next breakthroughs in demand forecasting, what has emerged as the discipline of “demand-signal management.” Instead of just relying on internal data such as order and shipment records, manufacturers are analyzing weekly and even daily point-of-sale data from retailers so that they can better see what’s selling where. This sort of timely, detailed data lets manufacturers spot trends much sooner by region, product, retailer, and even by individual store. Handling demand-signal data presents the same problems that real-time data causes in any industry: how to access and integrate high volumes of data, and then combine and analyze it alongside historical information. With the advent of highly scalable data warehouses, low-latency integration techniques, and faster, deeper query and analysis capabilities, the technology is finally here, at a price most can afford. 
And with easier-to-use business intelligence tools, manufacturers and retailers are pushing analytic tools into the hands of frontline decision makers, most often field sales and marketing people involved in planning, merchandising, and supply chain management. Over the last two years, Coty has pushed the responsibility for developing accurate forecasts down to its salespeople. Field-level forecasting makes for more accurate and responsive planning, says CIO Berry, who credits an analytics application from vendor CAS with making it easier for salespeople who are new to business intelligence to analyze point-of-sale data and develop forecasts. An important obstacle to broad adoption of demand-signal analysis has been the lack of standardization in the data supplied by retailers. Coty gets point-of-sale data from the likes of CVS, Target, and Walgreens, but each uses a different format. “The timeliness, accuracy, and depth of the data also varies from retailer to retailer, so it’s tough to bring it into a data warehouse,” says Berry. That being said, the payoff from early efforts by Coty has been more accurate forecasting, higher on-shelf availability, and more effective promotions. With faster and more detailed insight into demand, manufacturers can ratchet up revenue by 2 percent to 7 percent, which more than justifies any data-related headaches.
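The integration headache Berry describes, where each retailer sends point-of-sale data in a different format, is typically solved with per-retailer adapters that map every feed onto one warehouse schema. A minimal sketch, with invented feed formats:

```python
# Minimal sketch: per-retailer adapters normalize differently formatted
# point-of-sale feeds into one warehouse schema. The formats are invented.

def from_retailer_a(record):
    # e.g., a comma-separated line: "sku,units,store"
    sku, units, store = record.split(",")
    return {"sku": sku, "units": int(units), "store": store}

def from_retailer_b(record):
    # e.g., a dict with the retailer's own field names
    return {"sku": record["item"], "units": record["qty"], "store": record["loc"]}

feed_a = ["KM100,12,STL-04", "KM100,3,STL-09"]
feed_b = [{"item": "KM100", "qty": 7, "loc": "CHI-01"}]

warehouse = [from_retailer_a(r) for r in feed_a] + [from_retailer_b(r) for r in feed_b]
total_units = sum(row["units"] for row in warehouse)
print(total_units)   # 22 units of KM100 across all retailers
```

Once the feeds share a schema, cross-retailer analyses such as regional demand trends become a single query instead of a per-retailer chore, though varying timeliness and depth of the data remain, as Berry notes.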
Better Analytics Means Better Care Southeast Texas Medical Associates (SETMA) is a midsize practice in Beaumont,
Texas. In 2010, SETMA reduced its hospital admission rate by 22 percent, and the number of visits by diabetic patients around the holidays—a difficult time—by about 15 percent. How? By keeping their patients healthier. How? By using business analytics to improve the quality of their health care. Two years ago, SETMA—already an early adopter of NextGen EMR, a system that is used to automate medical practice workflows—implemented software by Cognos, part of IBM, to better analyze data gathered from records of past patient visits and incidents. As a result of these data gathering and consolidating efforts, SETMA now operates a data warehouse that includes records for all of their 65,000 patients. The Cognos implementation included modules used to analyze trends in the data, which are directly used by the doctors and not by middlemen IT specialists. Using the software,
SETMA management can better understand the quality of care provided to patients and ensure that doctors adhere to best practices, as outlined by a number of professional organizations, such as the National Committee for Quality Assurance. “Health care is behind the times in the use of BI,” says SETMA CEO James Holly. “But everything that BI can do in other industries, it’s doing for us at Southeast Texas Medical Associates, leveraging data to improve care.” SETMA has been able to improve and streamline its operations in many areas as a result of the new tools. Some of this is simply the result of better data about patients. For example, support staff can now run reports about patients before they come in for an appointment. This allows them to identify any missing or overdue tests and arrange for these tests to be taken care of before the patient sees the doctor; then the doctor and patient can go over the test results together. Other analyses are more sophisticated. Using the analytics functionality provided by Cognos, SETMA compared patients who are readmitted to a hospital after being discharged with those who are not, and used the results to identify how those two groups of patients differ in terms of ethnicity, age, gender, income, and follow-up care. This analysis revealed that patients who live alone are less likely to keep to their prescribed medication regimen, and it allowed SETMA to develop improved treatment plans to help keep those patients on their medications. Although some of these analyses and reports were theoretically possible before SETMA implemented Cognos, its legacy systems would take 36 hours to provide this kind of information, and with less sophisticated analysis of the results. The practice, then, would always be a day and a half behind the most current information about its patients—and a lot of things can happen to a patient’s health in that period of time. With Cognos and the new data warehouse, those reports take a few seconds to run. 
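The comparison SETMA ran, readmitted versus non-readmitted patients broken down by attributes such as living situation, amounts to a group-by over patient records. A sketch with synthetic data (not SETMA's):

```python
# Group comparison in the spirit of SETMA's analysis (synthetic data, not SETMA's).
patients = [
    {"readmitted": True,  "lives_alone": True},
    {"readmitted": True,  "lives_alone": True},
    {"readmitted": True,  "lives_alone": False},
    {"readmitted": False, "lives_alone": False},
    {"readmitted": False, "lives_alone": False},
    {"readmitted": False, "lives_alone": True},
]

def readmission_rate(group):
    return sum(p["readmitted"] for p in group) / len(group)

alone     = [p for p in patients if p["lives_alone"]]
not_alone = [p for p in patients if not p["lives_alone"]]

# A gap between the two rates is the kind of signal that prompted SETMA's
# improved treatment plans for patients who live alone.
print(round(readmission_rate(alone), 2), round(readmission_rate(not_alone), 2))
```

The analytical step is trivial once the warehouse exists; the case's point is that consolidating the records made running it a matter of seconds rather than 36 hours.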
“SETMA spent about $500,000 on the Cognos project,” says Holly. “It was expensive, but the payoffs are enormous, and we’re just scratching the surface,” he says.
Online Dating: The Technology behind Finding Love When Joe wanted to find love, he turned to science. Rather than hang out in bars or hope that random dates worked out, the 34-year-old aerospace engineer signed up for eHarmony.com, an online dating service that uses detailed profiles, proprietary matching algorithms, and a tightly controlled communications process to help people find their perfect soul mate. Over a three-month period, Joe found 500 people who appeared to fit his criteria. He initiated contact with 100 of them, corresponded with 50, and dated 3 before finding the right match. The “scientific” matching services, such as eHarmony, PerfectMatch, and Chemistry.com, attempt to identify the most compatible matches for the user by asking anywhere from a few dozen to several hundred questions. The services then assemble a personality profile and run it through an algorithm that ranks users within a set of predefined categories; from there, the system produces a list of appropriate matches. The technology that powers these dating sites ranges from incredibly simple to incredibly complicated. Unsurprisingly, eHarmony has one of the most sophisticated data centers. “The company stores 4 terabytes of data on some 20 million registered users, each of whom has filled out a 400-question psychological profile,” says Joseph Essas, vice president of technology at eHarmony. The company uses proprietary algorithms to score that data against 29 “dimensions of compatibility”—such as values, personality style, attitudes, and interests—and match up customers with the best possible prospects for a long-term relationship. A giant Oracle 10g database spits out a few preliminary candidates immediately after a user signs up, to prime the pump, but the real matching work happens later, after eHarmony’s system scores and matches up answers to hundreds of questions from thousands of users. 
The process requires just under 1 billion calculations that are processed in a giant batch operation each day. These operations execute in parallel on hundreds of computers and are orchestrated using software written to the open-source Hadoop software platform. Once matches are sent to users, the users’ actions and outcomes are fed back into the model for the next day’s calculations. For example, if a customer clicked on many matches that were at the outset of his or her geographical range—say, 25 miles away—the system would assume distance wasn’t a deal-breaker and next offer more matches that were just a bit farther away. “Our biggest challenge is the amount of data that we have to constantly score, move, apply, and serve to people, and that is fluid,” Essas says. To that end, the architecture is designed to scale quickly to meet growth and demand peaks around major holidays. The highest demand comes just before Valentine’s Day. “Our demand doubles, if not quadruples.”
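The geographical-range feedback described above can be sketched in a few lines of code. The following Python fragment is purely illustrative; the function name, the "outer 20 percent" threshold, and the 25 percent widening factor are assumptions made for the example, not eHarmony's actual logic:

```python
# Illustrative sketch of the distance-feedback idea described above:
# if a user clicks on many matches near the edge of their stated
# range, widen the search radius for the next day's batch run.
# All names and thresholds here are invented for illustration.

def next_radius(current_radius_miles, clicked_distances):
    """Return the search radius to use in tomorrow's batch run."""
    if not clicked_distances:
        return current_radius_miles  # no feedback; keep radius unchanged
    # Count clicks that landed in the outer 20 percent of the range
    outer_edge = 0.8 * current_radius_miles
    edge_clicks = sum(1 for d in clicked_distances if d >= outer_edge)
    if edge_clicks / len(clicked_distances) >= 0.5:
        # Distance doesn't appear to be a deal-breaker; widen a bit
        return round(current_radius_miles * 1.25)
    return current_radius_miles

# A user with a 25-mile range who mostly clicks on far-away matches:
print(next_radius(25, [22, 24, 24, 10]))  # radius widens to 31
```

In the real system, this kind of signal would be just one of many inputs fed back into the nightly scoring run described above.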
Chapter 6 | Telecommunications and Networks
Telepresence: GE Does Training and Meetings Face-to-Face, but Virtually GE’s former CEO and Chairman Jack Welch famously said, “The desire, and the ability, of an organization to continuously learn from any source, anywhere, and to rapidly convert this learning into action, is its ultimate competitive advantage.” Although he is now retired, the emphasis on education and training that Welch instilled into General Electric—one of the largest and most valuable companies in the world—still exists today, as strong as ever. GE invests about $1.2 billion every year in training, centered on its Crotonville, New York, facility, aptly dedicated to Jack Welch in commemoration of his retirement. Although some types of training can be done individually with prepared materials delivered through computers (i.e., computer-based training), there is an important aspect of training and shared organizational culture that requires people to meet other people in the process. In 2005, however, as part of the effort to reduce its carbon footprint, GE embarked on a project to reduce corporate travel. The big challenge, then, was how to keep facilitating meetings and training while at the same time limiting the need for people to travel to Crotonville. In addition, flying executives around for meetings and training was quite expensive. GE estimates that an executive flying on a round trip to Asia for a two-hour meeting would cost $30,000, consume two days in transit, and generate almost 7,000 pounds of carbon dioxide emissions. With the dual goals of replicating face-to-face meetings while at the same time saving money and limiting the environmental impacts of training, GE chose to pilot telepresence technology from Cisco. This technology uses high-definition video to create lifelike meetings.
Although it is not strictly part of the technology, meeting rooms are often identically set up across the different locations—same wall color, same furniture, and so forth—to create the impression that all participants are sitting in the same room, when they might be two continents away. Although the typical telepresence room seats 10 to 15 people in each location, GE chose to implement the technology in a large conference room with stadium-style seating and capacity for 60. “This Telepresence room is the first of its kind,” says Tim Hennen, senior vice president of AV integration with IVCi, the company that implements the technology. “Never before has an integrated Telepresence room been created that can deliver an optimal experience to over 18 participants, and this room’s capacity far exceeds that number.” GE has already started to use the room for training and meetings, and the experience has far exceeded its expectations. In fact, the ability to conduct any business meeting face-to-face has created a more intimate sense of belonging and connection, which has resulted in improved productivity. To underscore the flexibility provided by the facility, GE held its corporate executive council meeting—where all executives responsible for international operations report on their results—using telepresence instead of flying all of them to a common location, with many obvious benefits in time and money. “There was no negative impact to the participants who joined the meeting remotely,” says Timothy Peterson, GE’s lead desk-side support. “The executives were able to do anything they would have done if everyone had been present in the same room.”
Intranet Dashboard Revs Up Audi Australia Audi is a brand synonymous with sporty, progressive, and sophisticated cars that embody technological perfection. On the back of year-on-year record growth since 2004—including 30 percent growth in Australia in 2008—the company needed to position itself, and its national dealer network, to manage its future growth. Audi Australia has a network of 30 dealerships across Australia. It needed to communicate with a range of people within its dealer network and ensure that different roles within the dealership were given access to the right information. There was a complex network of stakeholders who required access: the solution needed to cater to 500 users who were broken into 90 different user groups. Audi had an existing portal built on open-source software. The business had outgrown this portal, which had become unreliable and required a lot of technical management. Audi has only one in-house IT staff member, and it needed a solution that could be administered and maintained by nontechnical staff, without intervention from a third-party supplier. “The old portal wasn’t letting us provide all the information we wanted to the dealers. We just couldn’t update it frequently enough,” says Wolf-Christian Vaross, IT Specialist for Audi Australia. “Administration of the old site wasn’t easy—to make changes we had to get a programmer to do it. The software might have been free initially, but we didn’t have the expertise to support it in house, and we didn’t want to keep paying someone outside the company to maintain it.” Audi chose the iD solution because it was able to deliver all the features Audi required out of the box. Another important component of the project was that the dealer portal had to meet Audi’s scrupulous design standards to match the company’s distinctive branding.
As part of the implementation, Audi involved the general managers from five key departments, including sales, corporate communications, and finance, to find out what information they needed to share with dealers. This ensured that the broader business would be involved in creating and maintaining the dealer portal and would have buy-in to the project. “The preparation process that iD took us through made it easy—they gave us an understanding of how to structure it,” says Vaross. “Now when someone from a dealership logs in, they’ll see the latest news relevant to them, and it will only take one mouse click for them to find what they’re looking for. It was important to give them the easiest possible route to the information they need,” adds Vaross. Audi’s dealer portal was launched on February 1, 2009, and enjoyed rapid uptake by its dealer users. The number of users increased by 450 percent in the second month of use. “As users have discovered that the new portal is easy to use and offers relevant information, they are already beginning to access more information via the portal,” says Vaross. “We have seen the number of pages they visit increase by 300 percent in the second month of operation.”
The NFL Scores with New Extranet The National Football League (NFL) is the most popular sports league in the United States, with the highest per-game attendance numbers of any domestic sports league worldwide. Composed of 32 teams that compete to win the Super Bowl, the league is also well known for its excellent management and organizational skills; indeed, it has been called “one of America’s best-run businesses” by BusinessWeek magazine. Although each team is a separate entity, the league provides central services geared to the production and promotion of the sport, as well as supporting individual teams and other organizations involved in the process, such as television broadcasters, game officials, fans, and the media. In 1997, the NFL became the first major sports league to implement a media-only Web site that could be used to distribute information—team and player statistics, player careers, attendance numbers, schedules, and so forth—to the various journalists and publications that cover the sport, both in the United States and worldwide. “We have a very good relationship with our media and are proud of the services we provide for them. The media portal is an extension of that, which is why we want to keep moving forward and making it the best resource we can,” says Leslie Hammond, Director of Media Services for the National Football League. By 2008, however, the site had become dated and new capabilities were required. The NFL decided to rebuild its media site as a custom portal, which could be used to provide all necessary information in a secure manner and in a way that best meets the needs of the media organizations. With that in mind, the league conducted extensive interviews and surveys with the media to identify which features were most needed in the new site. As a result of this process, the new portal was organized around specific “pages” that contained information the media needed about specific aspects of the sport.
For example, there is a separate Team Page for each team, which consolidates all team-related information and is most useful for those who exclusively cover one specific team. Then there are Release Pages, which contain leaguewide information and press releases, and Game Day Pages, which include all information necessary for the coverage of a specific game. Behind the scenes, the consolidation, editing, production, and publication of content is managed by workflow tools from IBM Workplace Web Content Management, which is tightly integrated with WebSphere Portal, the underlying technology hosting the portal. “We decided in favor of IBM WebSphere Portal to take advantage of the years of experience behind it and the number of companies that can support it,” says Joe Manto, vice president of Business Services and User Support for the National Football League. Since it was first launched, the portal has proved to be a success with both NFL personnel and the media. A number of NFL public relations staff travel constantly, and the portal allows them to upload and edit content remotely. On the media side, the portal provides customized content that puts all necessary information about a team or game at their fingertips and in a single place, eliminating the need to navigate to multiple places to get all necessary content. In addition, media users can subscribe to receive any news or new content related to the teams or games they are covering.
Wireless VPNs: Alternatives for Secure Remote Access Road warriors wirelessly connect to the corporate network from hot spots at airports or coffee outlets. Just a few years ago, common nightmare stories were told of even casual bystanders being able to eavesdrop on corporate communications made in such circumstances. As a result, there’s a widespread acceptance that VPNs are pretty much de rigueur for wireless use on the road. Fast-growing, New York–based Castle Brands uses a PPTP-based VPN, having first weighed open-source and proprietary VPNs. “We tried to keep the cost down, without compromising security,” says Director of IT Andre Preoteasa. “Throw in the up-front cost of some VPNs, the additional hardware, license fees, and yearly support costs, and costs soon climb. With PPTP, if you’ve got Windows XP, you pretty much have it.” Initial access to the network is password-based, explains Preoteasa, with subsequent access control following role-based rules maintained on the server in the form of Microsoft Active Directory. “People can’t just go anywhere and open up anything; the accounting guys get accounting access while the sales guys don’t,” he says. At London-based law firm Lawrence Graham, a combination of tokenless, two-factor authentication techniques helps ensure secure remote VPN wireless access, says the firm’s IT director Jason Petrucci. “When lawyers log on to the system remotely from a laptop, they are presented with three authentication boxes: one for their username, one for their log-on password, and the last for their combined personal PIN code and passcode,” he says. “SecurEnvoy is used to manage and deliver this passcode by preloading three onetime passcodes within a text message, which is delivered to the user’s BlackBerry.” As passcodes are used, replacements are automatically sent to each lawyer’s BlackBerry. “Our lawyers carry BlackBerrys with them wherever they go.
A physical token inevitably runs the risk of being left behind or lost altogether.” Meanwhile, at Fortune 50 insurance company MetLife, protecting against data leakage—especially with respect to client information—is of paramount importance when enabling remote wireless access, says Jesus Montano, assistant vice president of enterprise security. “The challenge is balancing people’s access requirements with our overall security requirements, and then working with them to find ways of creating an effective solution without compromising security,” he says. For wireless access from airports and coffee outlets, he explains, these days that means access via VPN vendor Check Point, solely from MetLife-owned laptops, with log-ons protected by RSA “hard token”–based, two-factor authentication. In addition to the encryption built into the VPN, all the data on the laptop are protected, he adds. “All wireless traffic is encrypted; the devices are encrypted and wrapped around with a firewall,” stresses Montano. “We think we’ve addressed the most obvious pitfalls in remote access, and think we’ve got a robust, highly engineered solution.”
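The preloaded one-time-passcode scheme used at Lawrence Graham can be sketched in code: the server keeps three unused codes on the user's device and sends a replacement each time one is consumed. The Python fragment below is an illustrative assumption only; the class name, six-digit code format, and callback interface are invented for the example and do not reflect SecurEnvoy's actual implementation:

```python
# Sketch of a preloaded one-time-passcode queue: three codes are
# delivered ahead of time, each is accepted once, and a replacement
# is sent whenever a code is used. (Illustrative names throughout.)
import secrets
from collections import deque

class PasscodeQueue:
    PRELOAD = 3  # number of unused codes kept on the device at all times

    def __init__(self, send_sms):
        self.send_sms = send_sms  # callback that delivers a code to the user
        self.valid = deque()      # codes the server will still accept
        for _ in range(self.PRELOAD):
            self._issue()

    def _issue(self):
        code = f"{secrets.randbelow(10**6):06d}"  # random 6-digit code
        self.valid.append(code)
        self.send_sms(code)

    def verify(self, code):
        """Accept the oldest outstanding code once, then replenish."""
        if self.valid and code == self.valid[0]:
            self.valid.popleft()  # one-time: the code cannot be reused
            self._issue()         # keep three codes preloaded
            return True
        return False
```

Combined with the username and log-on password, a scheme like this provides the two-factor log-on the firm describes, without a physical token to lose.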
View from Space: Satellite Farming for Greener Pastures Making the most of a farm’s natural resources is critical in today’s environment, where rainfall is becoming ever scarcer. While animal recognition technology is being used in Queensland to conserve water, on the other side of the country, in Western Australia, satellite technology is providing farmers with a suite of tools to accurately estimate the amount of feed in their pastures, how quickly their pastures are growing, and the pasture quality. For maximum efficiency on a farm, farmers need to use the pasture when it is at its best. According to Gonzalo Mata, who is in charge of farming systems and Web development for the project, the general rule of thumb is that only about 20 to 30 percent of pasture grown is utilized in many beef and sheep production systems. “Farmers need this information in order to match the animals’ nutrient demands for growth and reproduction with the supply of feed, which can be very seasonal. If this is not achieved, production is lower or costs increase through the use of supplements to achieve the balance,” Mata adds. According to Mata, you can’t manage what you can’t measure, hence the need to allow farmers to measure how much pasture there is on their farms. The tool uses images from a NASA satellite to create a composite greenness index. The climate data are sourced from the Bureau of Meteorology on a weekly basis, and the two data sources are combined in a pasture growth model. Pastures from Space boasts 97 percent accuracy, and it is possible for farmers with a subscription to achieve sustainable pasture utilization of more than 50 percent. “Building data over time allows the farmer to do comparisons for specific areas between seasons or between years, which can be a powerful tool to benchmark production and manage risk,” says Mata.
By going online, farmers can also look at maps of pasture growth rate (PGR) for their farm, giving them a better understanding of why some parts of the farm are performing better than others.
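The core idea of combining a satellite greenness index with weekly climate data in a growth model can be illustrated with a toy calculation. The formula, coefficients, and function name below are invented for illustration; the real Pastures from Space model is far more sophisticated:

```python
# Toy version of the data combination described above: a satellite
# greenness index and weekly climate figures feed a simple pasture
# growth estimate. All coefficients here are illustrative only.

def pasture_growth_rate(greenness, rainfall_mm, mean_temp_c):
    """Estimate pasture growth in kg dry matter per hectare per day.

    greenness:   composite index in [0, 1] derived from satellite imagery
    rainfall_mm: rainfall over the past week
    mean_temp_c: mean temperature over the past week
    """
    # Moisture and temperature factors cap growth in dry or cold weeks
    moisture = min(rainfall_mm / 20.0, 1.0)                  # 20 mm/week ~ unstressed
    thermal = max(0.0, min((mean_temp_c - 5) / 15.0, 1.0))   # ramp from 5 to 20 C
    max_growth = 60.0                                        # kg DM/ha/day ceiling
    return max_growth * greenness * moisture * thermal

# A green paddock after a wet, mild week:
print(pasture_growth_rate(0.8, 25, 18))
```

A production model would also calibrate against on-farm pasture cuts, which is how a figure like the quoted 97 percent accuracy would be validated.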
Around the World: Mobile Buying and Banking In 2009, many U.S. consumers whipped out their smartphones in brick-and-mortar stores to find better deals online, tripling mobile shopping revenue in just one year. The relationship between money and mobile devices, however, varies widely from one part of the world to another. Mobile banking grew significantly in India, while Africa, Latin America, and some other parts of the world appeared ready to bypass banking altogether in favor of payments handled by mobile operators. “Mobile commerce grew far faster in the U.S. than worldwide, vaulting from US$396 million in 2008 to an estimated $1.2 billion in 2009,” says analyst Mark Beccue of ABI Research. Drawing on information from multiple sources, ABI concluded that many smartphone users went shopping in physical stores, looked at products, checked out deals on the same items online, and made a purchase without even going home to log onto a computer. Shopping on the mobile Web is only now becoming popular in North America, though in Japan it already accounts for about 20 percent of online purchases, says Beccue. Worldwide, excluding Japan, mobile commerce grew from about $3 billion in 2008 to $4.4 billion in 2009. Meanwhile, the number of U.S. consumers using mobile banking more than doubled in 2009, from 4 million to 10 million. “That was partly driven by the slumping economy, because many consumers adopt mobile banking to check their balances frequently,” says Beccue. In addition, U.S. banks are starting to treat mobile as more than an extension of Web-based banking, with tools such as SMS (Short Message Service). But mobile banking is most popular in Asia and has made particular gains in India, where much of the country has limited banking infrastructure. Looking to remedy this problem, the Indian government in 2008 started encouraging banks to launch mobile platforms, he says.
“They see mobile banking as a way to accelerate the acceptance of personal financial services,” Beccue says. More than half of Asia’s mobile banking customers in 2009 were in India. Worldwide, the number of mobile banking users grew from 24.4 million in 2008 to 52.1 million in 2009. Half of those users were in the Asia-Pacific region. ABI expects to see 407 million people worldwide use mobile banking by 2015. But by that time, nearly as many people will be handling their money through their phones without ever opening a bank account. By then, approximately 405 million people will be using point-to-point payment systems in which the mobile operator takes in and pays out the cash, Beccue says. Point-to-point payment systems are becoming an important financial platform in countries where most people have never had access to banks. “In many parts of the developing world, mobile is the most common piece of infrastructure that exists,” says Beccue. “In many places, there are more mobile phones than there is running water or electricity.”
Ottawa Regional Hospital: Lowering Costs while Converting to VoIP What started out as an upgrade to the phone system at Ottawa Regional Hospital and Healthcare Center became a badly needed network overhaul that lowered costs and included a conversion to VoIP. The Ottawa, Illinois–based center was running an analog phone system that wouldn’t support an IP phone system, let alone the battery of high-bandwidth medical applications that are becoming more and more necessary, says Curt Sesto, director of facilities, construction management, and electronics for the center. When he arrived in 2008, his marching orders from the CEO were to get a new phone system right away. “It had been on his radar for a couple of years,” he says. One goal was to get rid of the estimated $28,000-per-year maintenance cost of the PBXs, for which it was getting increasingly hard to find parts as they grew older. “They could go toes-up at any time,” he says. Sesto checked out Siemens, Cisco, and Avaya VoIP systems. The Siemens system was being pushed by PosTrack, which also supplies Siemens medical gear to Ottawa Health. It was the only bidder that urged a data network evaluation as the first step in the process of moving to VoIP, he says. He liked that, as well as the fact that the Siemens offer was a hosted service: Siemens would take on network monitoring and maintenance, freeing up two to three full-timers to focus instead on implementing electronic medical records systems, another priority for the center. Voice traffic will run over the same network. The voice system is based on Siemens OpenScape servers located at two separate sites in Chicago for redundancy in case one goes down. Connectivity consists of a 30-mile connection over the local Mediacom cable TV network to the state-run Illinois Century Network, which is available to hospitals to connect to their local facilities.
The new phone system can be extended to 15 medical office buildings that are off the Ottawa campus, Sesto says. The old PBXs couldn’t handle them, so each had its own small Avaya PBX, which is being decommissioned as the central VoIP service rollout reaches each building. The VoIP system has given the center a new voicemail system that integrates with Outlook, so users get e-mail notification of voice messages. The system can also ring more than one phone when an extension is called, so an individual could configure it to ring the office phone along with home and mobile phones. Unified communications (UC) features in the system include fax-to-e-mail. The network overhaul was more extensive than the CEO had in mind when he asked for a new phone system, but it’s more appropriate to the high-bandwidth medical applications the network needs to support, Sesto says. “The old network was like having bicycle tires on an Indy car,” he says.