Cloud computing is simply the consolidation and abstraction of previously disparate computing infrastructure. In the past, computing infrastructure and servers for web applications, websites, databases, and more had to be purchased and managed in an in-house data center. This placed a heavy burden of upfront cost on IT projects. Cloud computing changes this and puts control back in the hands of creative innovators.
Cloud computing was pioneered by internet retailing giant Amazon.com. In 2006 it started selling spare computing cycles in its data centers with the launch of Amazon Web Services. This innovation has allowed start-ups to rapidly prototype and deploy ideas to the web by essentially renting computing time in a cloud provider's data center. By paying only for what is needed, new ideas and products can be quickly scaled if successful or abandoned if not. Managers may leverage iterative operational feedback without the upfront investment in costly IT infrastructure. Economies of scale are also leveraged on the energy consumption and sustainability front. As usage changes, resources may be elastically scaled based on need. Applications based in India may use the same infrastructure used in America while the latter sleeps, and vice versa.
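To sketch what that elasticity looks like, here is a minimal, entirely hypothetical autoscaling decision in Python. Real providers expose far richer policies; the thresholds and limits below are made up for illustration.

```python
def scale_decision(current_instances, cpu_utilization,
                   scale_up_at=0.75, scale_down_at=0.25,
                   min_instances=1, max_instances=10):
    """Pick a new instance count from average CPU utilization.

    The thresholds here are illustrative defaults, not any real
    provider's autoscaling policy.
    """
    if cpu_utilization > scale_up_at and current_instances < max_instances:
        return current_instances + 1   # busy: rent one more server
    if cpu_utilization < scale_down_at and current_instances > min_instances:
        return current_instances - 1   # idle: stop paying for one
    return current_instances           # steady state: no change

# While America sleeps, the same pool shrinks there and grows elsewhere.
print(scale_decision(2, 0.9))   # under load, scale up
print(scale_decision(2, 0.1))   # idle, scale down
```

The point is simply that capacity follows demand, so you pay only for what you use.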
There are many advantages to cloud computing, but it is no panacea. The overall complexity of an IT system needs to be taken into account before a decision is made to move it to a cloud provider. Depending on the industry, regulations may need to be considered with respect to IT infrastructure. To save money it's important to do a complete total cost comparison; cloud computing is not always cheaper. In-house IT staff resources should also be considered to avoid stress related to redundancies. An incremental, piecemeal migration strategy may be the best option for most organizations to avoid costly implementation mistakes. Leveraging a competent IT department is the key to success when formulating and implementing a cloud strategy. Tread carefully and wisely to succeed.
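As a back-of-the-envelope illustration of why that total cost comparison matters, here is a toy Python calculation with invented numbers; a real comparison would also have to account for staffing, power, licensing, and migration costs.

```python
def total_cost(upfront, monthly, months):
    """Naive total cost of ownership over a planning horizon."""
    return upfront + monthly * months

# All figures below are invented purely for illustration.
in_house = total_cost(upfront=50_000, monthly=1_000, months=36)
cloud    = total_cost(upfront=0,      monthly=2_800, months=36)

print("in-house:", in_house)
print("cloud:   ", cloud)
print("cheaper: ", "in-house" if in_house < cloud else "cloud")
```

With these made-up numbers the in-house option wins over three years, which is exactly why "cloud is cheaper" should be verified rather than assumed.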
Over the past few decades there has been a slow and steady merging of our physical world with the Internet. The miniaturization of computers and the proliferation of wireless connectivity have enabled the installation of microchips and sensors into more and more things. Analysts call this phenomenon the Internet of Things (IoT). Our phones have been transformed from simple devices for audibly communicating with each other into automatic sensing devices that record the who, what, where, and when of our daily lives. Jet engines now have tiny computers and sensors that detect and diagnose minor problems to be fixed, avoiding costly loss of equipment or lives in flight. The Internet of Things is freeing up the human brain to focus on bigger problems and saving money in the process.
It brings real-time decision making, but also data storage overhead and a deeper reliance on technology.
Impact on Business
The evolution of IoT will be highly disruptive to business and its competitive landscape. More and more data will be available, providing insight into how a business really operates on an atomic level. Data-driven decisions will supplant human intuition and ultimately a manager's "gut feeling". Previously manual business processes will become more and more automated. This may lead to job loss in the short term, but that will likely be offset by job gains in IoT-related sectors. Energy consumption will decrease as processes are made more energy efficient. Overall, management's decision-making toolkit will broaden and become more effective.
Successful managers need to be aware of potential applications in their business. Previously unknown opportunities will surface, so the ability to see the forest for the trees is important. Building out IoT infrastructure may be expensive in the near term, so a long-term outlook is key. It is important for managers to seize the day now before their competitors beat them to it. It's only a matter of time, so get on it, people!
Do you have business applications in your organization that users under your charge complain about or don't use at all? Why would they? They're boring. This is where gamification comes in. Gamification is the use of game thinking and mechanics in non-game applications to engage users. Put simply, it's a way of getting people to do things they would rather not do by making those things fun.
A simple example is filling out an expense report. Why not link completing expense reports promptly and accurately to a scoring system visible to coworkers? Reward systems common in the video game world, such as achievement badges and scores, can be applied to saving per diem money, getting a deal on a hotel room, or submitting a report promptly after returning from a trip. This could ultimately save the organization money and increase employee participation. The scores could be kept on a leaderboard and rewards given to the top performers.
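A minimal sketch of such a scheme in Python; the point values, badge names, and employee names are all made up for illustration.

```python
from collections import defaultdict

class ExpenseReportGame:
    """Toy points-and-badges tracker for expense report submissions."""

    def __init__(self):
        self.scores = defaultdict(int)
        self.badges = defaultdict(set)

    def submit_report(self, employee, days_after_trip, saved_per_diem):
        points = 0
        if days_after_trip <= 3:           # prompt submission bonus
            points += 50
            self.badges[employee].add("Prompt Filer")
        points += int(saved_per_diem)      # one point per dollar saved
        self.scores[employee] += points

    def leaderboard(self):
        """Highest score first, ready to post for coworkers to see."""
        return sorted(self.scores.items(), key=lambda kv: -kv[1])

game = ExpenseReportGame()
game.submit_report("Ana", days_after_trip=2, saved_per_diem=40)
game.submit_report("Bo", days_after_trip=10, saved_per_diem=15)
print(game.leaderboard())
```

Prompt, frugal filers float to the top, and the badge gives an extra nudge beyond raw points.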
Impact on Business
Like many trends in business, gamification is a double-edged sword. In one situation it could work great, and in another it could fail miserably. It has been estimated that by 2014, 80 percent of current gamified applications will fail to meet the business goals they were developed for. The primary reason for this outcome is poor design. It's important to get it right the first time and to be ready to scrap a gamification idea that doesn't perform well. Those developing the application must be creative and have a solid understanding of game design.
Gamification is new and should be approached carefully and intentionally before committing resources to a project. An 80% failure rate indicates these nascent techniques have some maturing to do. In time the techniques will mature, best practices will surface, and patterns will develop. The idea is grounded in human psychology, which is not subject to fads or trends. You have time to wait and see; use it.
Woo hoo! The semester is over! A professor once asked me what I learned in MIST 7500. This is an interesting question for me. I came into the Master of Internet Technology program with over a decade of experience in internet technology, so I didn't expect to learn a lot and was braced for disappointment. It turns out I actually did learn a lot, not about technology itself, but about how technology is applied and used in business. I learned these lessons from the professor and a few guest speakers sprinkled throughout the semester. This was the key insight I didn't know I needed.
The application of internet technology in business was the focus of the first several classes. We learned about the Business Model Canvas, SWOT analysis, and Porter's Five Forces model, among others. These abstractions of business serve to simplify and focus what an organization is all about, providing managers and stakeholders with a framework for making decisions. Applying these analytic methods to my work has enabled me to focus my limited time and effort on the most effective parts of the problems I am presented with.
The most notable guest speaker, in my view, was Colleen Jones. Her focus is content strategy, a nascent specialty that leverages analytic methods and scientific research to determine the most effective way to use web content to achieve an organization's goals. In the '90s and '00s the focus of web strategy was simply to have a presence on the web. Recently, though, innovation has abstracted many of the technological hurdles away, clearing the way for Colleen and her analytical methods. With her book Clout: The Art and Science of Influential Web Content (reviewed here on this blog) and her consultancy Content Science, she has become one of the foremost thought leaders in content strategy.
Another speaker I found interesting was Jason Lannen, who discussed auditing and controls in IT. Working in government, this hit close to home. Almost everything I deal with on a daily basis involves some level of auditing and controls. Hearing how this is handled in the private sector let me contrast my experience with his, and it made me a lot less frustrated with many of the challenges I face at work.
I recently read a research paper by Boris Fritscher and Yves Pigneur titled Business IT Alignment from Business Model to Enterprise Architecture. It describes how to translate a business model into an enterprise architecture model such as ArchiMate. This model consists of three layers: Business, Application, and Technology. Using their technique, the same architecture may be applied to my fictional LEGO builders' web community website mentioned in a previous post.
At the bottom of the ArchiMate model is the Technology (IT infrastructure) layer, consisting only of Key Resources (6) from the BMO. Obviously this would include cloud application hosting. The application would be maintained by a development team and run on a cloud platform administered by an IT team, or perhaps by the developers themselves depending on their skills. Community managers would not be part of this layer; perhaps that entry is redundant and can be removed from that part of the BMO.
Sandwiched in the middle of the ArchiMate model is the Application layer, which includes Key Activities (7) from the BMO. The lower Technology layer enables this middle Application layer, which covers community management and website administration, consisting of development and support. It could also include social media and other communication tools used to manage relationships with the partners and customers identified above it in the ArchiMate model.
The top layer of the ArchiMate model is the Business layer. At the top of that layer are external roles and actors, which include Key Partners (5) and Customer Segments (1) from the BMO, consisting of Bricklink (an unofficial LEGO marketplace), The LEGO Group, and our core audience: LEGO builders. Supporting them are external business services, which include items from the Value Proposition (2), Customer Relationships (4), and Channels (3). In our case this would consist of direct brick sales via Bricklink or The LEGO Group and the user community's data and contributions. Users would create their content using tools on the website and build it using bricks either purchased or taken from their collections, using parts lists generated by the website.
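The three-layer mapping above can be summarized as a simple data structure. The Python sketch below uses my own shorthand, not real ArchiMate notation, and just restates the mapping from the paragraphs above.

```python
# Three ArchiMate layers mapped to Business Model Ontology (BMO) blocks
# for the fictional LEGO builders' community site. Element names are
# informal shorthand, not ArchiMate syntax.
architecture = {
    "business": {
        "bmo_blocks": ["Key Partners (5)", "Customer Segments (1)",
                       "Value Proposition (2)", "Customer Relationships (4)",
                       "Channels (3)"],
        "actors": ["Bricklink", "The LEGO Group", "LEGO builders"],
    },
    "application": {
        "bmo_blocks": ["Key Activities (7)"],
        "services": ["community management", "website administration",
                     "social media / communication tools"],
    },
    "technology": {
        "bmo_blocks": ["Key Resources (6)"],
        "infrastructure": ["cloud application hosting"],
    },
}

for layer, contents in architecture.items():
    print(layer, "->", ", ".join(contents["bmo_blocks"]))
```

Laying it out this way makes the alignment between business model and architecture easy to review at a glance.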
Last week Chris Hanks from the Terry College joined us to discuss entrepreneurship. According to Hanks, the primary reason people start their own business is that they hate their boss. Another reason is so they can do what they love. These two reasons resonated with me. I have always wanted to quit my job and start my own business. The problem is I don't hate my boss, my co-workers rule, and I love what I do. Based on my experience and conversations with friends, it appears my situation is unusual. This is quite a conundrum.
Working for someone else, as opposed to working for yourself, means that you don't reap the rewards of your work in the same way. When you are the boss you take the monetary risk and reap the rewards. These rewards can be huge…or not. In my case I work for the CDC, and my job is relatively secure compared to similar jobs in the private sector. This perceived security, coupled with my work environment, is a major hurdle on the path to potentially making serious money. How do you value a good work environment? What if I quit my job, fail, and end up in a job that sucks!? Ugh.
I think a third way for me may be not quitting my job. My job affords me the work/life balance to run a small business in my off time. A small business on the side lets me supplement my income with other income streams. This augments my job and enhances an already secure situation. If something bad happens with the business, meh, I still get paid every two weeks. The question for you, though, is what should you do? Well, that's up to you.
I just watched three fascinating videos about the future of the Internet (embedded below). They present differing views of how the Internet (and the Web) will evolve. Vint Cerf discusses the future from an infrastructure perspective. He proposes several ideas to make the Internet more robust, such as multihoming, Internet broadcast augmentation, sensor networks, mesh networks, defenses against bit-rot over time, and an interplanetary Internet. Tim Berners-Lee explores the future from the perspective of the World Wide Web, making raw data available through Linked Data. Finally, Kevin Kelly speaks about the Internet as an Internet of things that are fully interconnected and communicating, thus creating a super-organism.
Their talks were similar in that they all discussed the Internet becoming more mobile. This will be enabled by mesh networks that can sense other devices and by multihoming, which lets users remain uniquely addressable while their service provider changes. Users will be able to query data related to their current location and update the data when it is lacking. Those data will be aggregated and organized by ever more sophisticated applications and provided to end users. Indeed, we are already in a world where no one knows phone numbers anymore and arguments in bars are settled in seconds via a Google search on a smartphone. We are moving towards a world of co-dependence with our machines that will ultimately bring us closer together, make us more productive, and deliver a quality of life through interconnection never before experienced by humans.
I thought their talks were lacking in the areas of augmented reality and nanotechnology, though they touched on them from a data and connection perspective. Brain-computer interfaces are advancing rapidly. I foresee a day when we have tiny machines in our bodies sensing our vital signs, killing cancer cells, and maintaining the integrity of our DNA. They would use mesh networks to communicate with one another. An interface to a computer embedded in our brain could process this information and communicate it to our visual cortex or perhaps directly to our thoughts. That may sound like pure science fiction, but do you know the one thing almost no science fiction writer ever envisioned? The Internet. Perhaps real life is in fact stranger than fiction.
I recently watched a fascinating video (see below) that presents the history of the Internet from 1958 to the present. It mentions several important technological events that led to the Internet of today. Below are five that I think are the most important.
In 1958 the United States set up the Defense Advanced Research Projects Agency (DARPA) in response to the Sputnik launch by the Soviet Union. Its mission was to prevent strategic surprise and ensure the United States stayed at the forefront of military technology. This mission required the agency to coordinate research conducted by institutions throughout the country. It supported the evolution of the ARPAnet in order to prevent duplication of research and to speed development. The ARPAnet was the foundation the modern Internet is built on.
Packet switching was an innovation important to the early ARPAnet because it made shared networks more efficient. Packet-switched networks divide data into equally sized packets before transmitting them over the network. These smaller packets allow networking hardware with limited computing resources to queue and buffer data as it is transmitted between nodes. The ARPAnet could then effectively manage network capacity, allowing it to grow into the Internet we use today.
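The core idea can be sketched in a few lines of Python: split a byte stream into fixed-size, sequence-numbered packets, then reassemble the stream even if the packets arrive out of order. This is an illustration of the concept, not how ARPAnet hardware actually worked.

```python
import random

def packetize(data, size):
    """Split a byte stream into fixed-size packets with sequence numbers."""
    return [(seq, data[i:i + size])
            for seq, i in enumerate(range(0, len(data), size))]

def reassemble(packets):
    """Rebuild the stream; sorting by sequence number tolerates
    out-of-order arrival."""
    return b"".join(chunk for _, chunk in sorted(packets))

msg = b"packet switching splits data into manageable chunks"
packets = packetize(msg, 8)
random.shuffle(packets)            # simulate packets taking different routes
restored = reassemble(packets)
print(restored == msg)
```

Each packet is small enough for modest hardware to buffer, and the sequence numbers let the endpoints put the stream back together.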
In the early 1970s the French developed a research packet-switching network called CYCLADES. It was the first network to make the hosts, rather than the network, responsible for reliable data delivery. This greatly reduced the complexity of packet switches and proved that host-based responsibility was indeed workable. Though CYCLADES was shut down by the French in 1981, the lessons learned were incorporated into the OSI model.
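The host-based idea can be illustrated with a toy Python simulation: a dumb channel that silently drops packets, and a sender that keeps retransmitting until it sees an acknowledgement. Names and numbers here are invented for illustration, not taken from CYCLADES itself.

```python
class LossyChannel:
    """Pretend network that silently drops the first `drops` packets.
    The channel knows nothing about reliability; that burden is the host's."""
    def __init__(self, drops):
        self.drops = drops

    def send(self, payload):
        if self.drops > 0:
            self.drops -= 1
            return False       # packet lost in transit, no ACK comes back
        return True            # delivered, receiver ACKs

def send_reliably(payload, channel, max_tries=10):
    """Host-based reliability: retransmit until acknowledged or give up."""
    for attempt in range(1, max_tries + 1):
        if channel.send(payload):
            return attempt     # number of transmissions it took
    raise TimeoutError("no acknowledgement after %d tries" % max_tries)

tries = send_reliably(b"hello", LossyChannel(drops=2))
print("delivered on attempt", tries)
```

Because the endpoints handle loss, the switches in the middle can stay simple, which is exactly the lesson later protocols took from CYCLADES.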
The X.25 protocol was developed by the ITU Telecommunication Standardization Sector (ITU-T) to provide network access via telephone company connections. This had the benefit of lower costs for users all over the world because they could connect using installed phone systems. It was developed before the OSI model and consists of three layers, which are similar to the three lowest of the OSI model's seven layers. In my opinion the most significant impact of X.25 was its ability to give users in the 1980s network access without requiring costly new infrastructure.
The Transmission Control Protocol (TCP) and Internet Protocol (IP) were developed and integrated into the ARPAnet between 1973 and 1983. The US Department of Defense was the first to declare TCP/IP the standard for all military computer networking, in March 1982. Its adoption exploded in 1989 when AT&T agreed to place TCP/IP code developed for UNIX into the public domain. Software companies such as IBM and Microsoft incorporated this code into the TCP/IP stacks of their operating systems into the 1990s, thus standardizing the Internet on TCP/IP.
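One legacy of that standardization is the socket API still shipped with every major operating system. Here is a minimal TCP round trip over the loopback interface using only Python's standard library; the message text is arbitrary.

```python
import socket
import threading

def echo_once(server_sock):
    """Accept one connection and echo back whatever arrives."""
    conn, _ = server_sock.accept()
    with conn:
        conn.sendall(conn.recv(1024))

# Server side: bind to port 0 so the OS picks a free ephemeral port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
t = threading.Thread(target=echo_once, args=(server,))
t.start()

# Client side: connect, send, and read the echo back over TCP.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(server.getsockname())
client.sendall(b"hello, internet")
reply = client.recv(1024)

client.close()
t.join()
server.close()
print(reply)
```

Decades after the protocol wars, a TCP connection is still roughly this simple to open, which is a testament to how thoroughly TCP/IP won.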
Still interested? Check out this video for an overview of the history of the Internet.
In business, executives and managers have a responsibility to ensure that their venture stays viable and growing. This requires them to analyze their competitors, customers, and market forces to guide their decisions. Patterns are identified to develop abstract strategies and models that are applicable across domains. Since the 1970s, academics have developed and refined several techniques to model businesses.
The first of these techniques is SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis, which spawned a variant called TOWS. SWOT and TOWS were not sufficient on their own, so Porter's Five Forces analysis was developed, leveraging industrial organization economics to determine the five forces that compose a market's competitive intensity: the threat of new entrants, the bargaining power of suppliers, the bargaining power of buyers, industry rivalry, and the threat of substitute products or services. Executives use the insights gleaned to decide whether a market is worth entering or exiting. A sixth force was later added to address complementors and government. A final, higher-level analysis is the PEST analysis, which describes political, economic, social, and technological factors. These factors affect entire industries and help managers with macro-level strategic decisions.
Indeed, while these techniques are no panacea, they do help illuminate the competitive landscape, enabling decision makers to deliver maximum value to their stakeholders. These models fail, though, when they are based on flawed information. Managers must ensure their models rest on well-researched facts and educated assumptions to have the best chance of successful execution.