Essay 1: Compare and contrast your three most important enterprise technology trends for 2017

Reading Appendix 1, compare and contrast your three most important enterprise technology trends for 2017.


Draw out the critical principles and issues in each trend, then compare and contrast them using, say, an advantage-disadvantage or SWOT model, as you see fit. Remember – in learning, you will be looking to interpret why you rate the three as important, and also (perhaps) why you would rank them 1, then 2, then 3 – giving them some priority and focus. As a future CIO executive, you will need to make these choices and apply budgets and resources.

Appendix 1

5 enterprise technologies that will shake things up in 2017
Triple A security, the Internet of Things and AR/VR to make their marks
Network World | Dec 19, 2016 2:38 AM PT

The Internet of Things – for real

Yes, yes, we know – it’s one of those long-standing tech industry jokes, like “the year of the Linux desktop” and “Java security.” But 2017 really could be the year
that all the hub-bub and hype around the Internet of Things comes home to roost. The basic concept of connected devices – broadly, things that haven’t historically been connected to the internet suddenly being connected to the internet – is nothing
new. The uses to which the technology is now being put, however – smart cars, smart homes and dramatically simplified deployments – are another story.

The main problem is security, as it has been since people started thinking about IoT as a concept. There aren’t many commonly accepted standards for IoT devices –
though there’s no lack of candidates – and vendors don’t seem to work as hard to make connected devices secure as they do on more traditional endpoints, like laptops
and smartphones. That has big implications for security. Even if a hacked IoT device doesn’t represent much of a threat on its own, it’s simple enough to incorporate it into a vast
botnet, which is exactly what the attackers behind the Mirai botnet have been up to lately, exploiting DVRs, surveillance cameras and other poorly secured IoT devices
and making them into a zombie army able to hamstring internet access across the U.S. by attacking DNS provider Dyn.

It’s a big challenge, according to analyst and Network World contributor Zeus Kerravala. “[IoT security] requires strengthened network access controls, including real-time application control and visibility, IoT-supported, secure-authentication methods
such as PPSK, granular device policy enforcement at the edge, and centralized reporting and monitoring tools,” he said, in commenting on a new IoT security offering
from Aerohive Networks.

Forrester Research thinks more than half a million IoT devices will be compromised in 2017, which underlines the extreme importance of security. One way or another,
IoT will shake up computing in 2017 – either as a key underpinning of a host of new technologies, or as the venue for further devastating cyberattacks.

By Jon Gold

Augmented reality and virtual reality will take off

When the iPad was introduced in 2010, you would rarely see one in the wild – never mind being used for business. Now, iPads and tablets are everywhere. Their use
exploded. Prepare for the same thing to happen with virtual reality (VR) and augmented reality (AR)—with tablets and smartphones as the vehicle. According to IDC, 25% of
enterprise IT organizations will be testing augmented reality business applications for use on smartphones by the end of 2017. “This may sound relatively aggressive, but the conversations I’m having with the industry and some surveys that we’ve run talking to IT decision makers show that
there’s a really strong interest around augmented reality,” said Tom Mainelli, program vice president of the devices & AR/VR group at IDC, during a recent webinar, IDC
Futurescape: Worldwide Wearables and AR/VR 2017 Predictions.

The end game is head-worn AR hardware, such as the Microsoft HoloLens, he said. But many enterprises are going to begin by creating apps and back-end processes for devices that consumers and businesses already own.

Pokémon Go gave us a taste of AR, and we’ve seen retailers using AR technology. Walgreens and Toys R Us use an app called Aisle411 that guides customers to products within the store. North Face provides 360-degree videos of outdoor experiences using Oculus Rift in which the actors wear North Face clothing. Audi has a virtual experience that allows you to take a virtual test drive and virtually see features and options on its cars. And Ashley Furniture will soon have an AR app that helps
shoppers see how home furnishings fit into an existing space. As smartphone technology improves, we will see much better AR experiences, Mainelli said. The first product working toward that is the Lenovo Phab 2 Pro, which is
based on Google’s Tango technology. It uses three cameras and multiple sensors to see where it is and capture a wide range of measurements to create an enhanced AR
experience, he said.

Other AR and VR predictions from IDC:

•In 2017, retail industry spending on AR/VR hardware, software and services will increase by 145% to more than $1 billion.
•Three out of 10 consumer-facing Fortune 5000 companies will experiment with AR or VR as part of their marketing efforts in 2017.
•By 2019, 10% of all web-based meetings will include an AR component, driving disruption of the $3 billion web conferencing market.

“I really believe augmented reality is going to have the same type of impact on businesses as the PC did all those years ago,” Mainelli said. “And once developers
start to figure out what they can do with this technology, business is going to change pretty dramatically. … Eventually we will end up at a place where augmented
reality really is the new way that we interface with devices, digital content, physical objects and with data.”

By Michelle Davidson
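The measurement-driven use case above – for instance, Ashley Furniture’s app helping shoppers see whether furnishings fit an existing space – reduces at its core to comparing dimensions the AR device captures against product dimensions. A minimal Python sketch (illustrative only; the names, data shapes, and 5 cm clearance are assumptions, not taken from any real AR app):

```python
# Illustrative sketch: given room-scale measurements captured by an AR-capable
# phone, check whether a piece of furniture fits a measured floor space,
# allowing for rotation and a small clearance on each side.
from dataclasses import dataclass

@dataclass
class Footprint:
    width_cm: float
    depth_cm: float

def fits(space: Footprint, item: Footprint, clearance_cm: float = 5.0) -> bool:
    """True if the item (plus clearance on each side) fits the space,
    in either orientation."""
    w = item.width_cm + 2 * clearance_cm
    d = item.depth_cm + 2 * clearance_cm
    return (w <= space.width_cm and d <= space.depth_cm) or \
           (d <= space.width_cm and w <= space.depth_cm)

# Alcove dimensions as the phone's AR sensors might report them.
alcove = Footprint(width_cm=220, depth_cm=100)
sofa = Footprint(width_cm=198, depth_cm=88)
print(fits(alcove, sofa))  # True
```

A real app would also render the model in the camera view; the point here is only that the “does it fit” decision is plain geometry over the measurements the AR sensors supply.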
Triple A protection coming to the world of cybersecurity

It may be a brave new world in 2017, but it’s also a damn scary one for IT security professionals. Just take a look at some recent Gartner assessments of the security situation:

•By 2020, 60% of digital businesses will suffer major service failures due to the inability of IT security teams to manage digital risk.
•By 2020, 60% of enterprise information security budgets will be allocated for rapid detection and response approaches, up from less than 30% in 2016.
•By 2018, 25% of corporate data traffic will flow directly from mobile devices to the cloud, bypassing enterprise security controls.
•Through 2018, over 50% of IoT device manufacturers will not be able to address threats because of weak authentication practices.

So what technologies are going to change this scenario back in favor of IT? The new security AAA – automation, analytics and artificial intelligence – say
proponents. When it comes to automation, security platforms will devise and execute controls based on newly detected threats and do it without human intervention. That reduces the
time between a compromise and the moment the threat is neutralized – shrinking the window during which attackers can do damage.

Security analytics engines digest data from network gear and endpoints in search of anomalies that indicate threats. By setting a baseline for normal, these engines spot out-of-the-ordinary behaviors and assess whether they represent malicious activity. By incorporating AI and machine learning, this technology will expand its ability to detect anomalies not only in network traffic, but also in the behavior of individual
machines, users, and combinations of users on particular machines. As these platforms become more sophisticated and trusted in 2017, they will be able to spot attacks at earlier stages and stop them before they become active breaches.

The big guns are all involved in making this happen: Cisco with its Tetration Analytics platform, IBM with Watson cognitive computing for cybersecurity, and Google/Alphabet with its DeepMind lab, to name just a few.

Cisco’s Tetration Analytics product is a turnkey package that gathers information from hardware and software sensors and analyzes the information using big data
analytics and machine learning. In the security realm the system sets a baseline for normal network and application behavior and quickly identifies any deviation in
communication patterns in real time or uses Tetration’s forensics search engine to look for other security or user behavior analytics. “The single most important things customers can do to protect the data center is set up a whitelist of who has access to what, but it is one of the most difficult
tasks to implement,” said Tom Edsall, a senior vice president and CTO with Cisco. “Tetration lets users set up a white list model and policies more quickly and
efficiently than they could before.” This capability will address key cybersecurity challenges and move toward the “self-driving data center” of the future, he said.  Cisco promises many new security-related applications will be layered onto Tetration. Then we have IBM’s Watson supercomputer, which is being unleashed in corporate networks to analyze traffic in search of malware, but also learning at the same time via
its own experiences and by taking in white papers, threat intelligence and news about cybercrime. So over time, Watson will develop new strategies for finding attacks
as they unfold. The Watson for Cybersecurity project is in beta now and could become a full-fledged cybersecurity service sometime in 2017.

Separately, there is governmental research underway that could impact the cybersecurity world this year as well. For example, the Intelligence Advanced Research Projects Activity (IARPA), the radical research arm of the Office of the Director of National Intelligence, wants to build a system of what it calls sensors that can monitor
everything from search terms to social media output to look for early warning signs of cyberattacks. “Cyber-attacks evolve in a phased approach. Detection typically occurs in the later phases of an attack, and analysis often occurs post-mortem to investigate and
discover indicators from earlier phases. Observations of earlier attack phases, such as target reconnaissance, planning, and delivery, may enable warning of
significant cyber events prior to their most damaging phases,” IARPA wrote in announcing its Cyberattack Automated Unconventional Sensor Environment (CAUSE) program.

“It is expected that the technology developed under the CAUSE Program will have no ‘human in the loop.’ Experts may help develop, train, and improve the solution systems, but they will not manually generate warnings, guide the system, or filter warnings before they are delivered to the [IARPA] Team. The performer-produced warnings must be machine-generated and submitted automatically…,” IARPA wrote of the system.

By Michael Cooney
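The “set a baseline for normal, then flag deviations” approach these analytics engines share can be shown in miniature. A toy z-score detector in Python (a sketch of the general idea only, not any vendor’s actual engine; the traffic numbers are invented):

```python
# Toy illustration of baseline-driven security analytics: learn "normal"
# traffic volume from history, then flag observations that deviate sharply.
import statistics

def build_baseline(samples):
    """Mean and sample standard deviation of historical per-minute byte counts."""
    return statistics.mean(samples), statistics.stdev(samples)

def is_anomalous(value, baseline, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mean, stdev = baseline
    if stdev == 0:
        return value != mean
    return abs(value - mean) / stdev > threshold

# Invented history of per-minute traffic during normal operation.
history = [980, 1010, 995, 1005, 990, 1000, 1015, 985]
baseline = build_baseline(history)

print(is_anomalous(1003, baseline))   # False: ordinary traffic
print(is_anomalous(45000, baseline))  # True: a Mirai-style flood stands out
```

Real engines watch many signals at once and use far more robust statistics and machine-learned models, but the shape of the decision – deviation from a learned baseline – is the same.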
Bullish for blockchain

There’s no shortage of hype around blockchain’s potential to revolutionize transactions. Heading into the new year, some enterprises will put blockchain hype to the
test as they start exploring its ability to reduce transaction costs, streamline partner interactions, and accelerate business processes.

Blockchains are distributed public ledgers, lauded for their ability to establish trust in the digital world by way of verifiable transactions and without the need for
a middleman. The cryptocurrency bitcoin is the most familiar application. In the financial world, blockchains are expected to disrupt how financial institutions
conduct payments and wire transfers, process securities trades, and handle compliance reporting, to name just a few use cases. Outside of finance, industry watchers cite opportunities for blockchains to play a role in core business functions from supply chain and manufacturing to legal and
healthcare. When there’s an audit trail required – to track the provenance of finished goods, for example, or to document a real estate title – blockchain networks can
be used to create verifiable, tamper-proof records in an encrypted format and without a central authority.

Enterprise IT leaders “are not so much interested in secure, anonymous public networks like bitcoin but in closed networks that are between specific groups of people,
particularly between enterprises that have to interact,” says Roger Kay, founder and president of market intelligence firm Endpoint Technologies Associates. In a blockchain, each page in a ledger of transactions forms a block, which is linked via a cryptographic hash to the previous block, and new transactions are
authenticated across the distributed network before the next block is formed. “Blocks are always agreed on, and each one has an encrypted representation of everything
that happened before, so you can tell it’s authentic. You can’t tamper with the chain at any point,” Kay says. As a trust system, “it essentially eliminates the need
for a third-party guarantor.”

That’s not to say blockchain technology is mature, however. “It’s still early days,” Kay warns. Early adopters have launched hundreds of pilot projects, but there’s a long way to go before blockchain hits mainstream adoption. Among the obstacles blockchain
deployments face are technical challenges, a lack of standards and governance models, a shortage of skills, and scalability concerns.

As 2016 closes, vendors continue to devise distributed applications and platforms based on blockchain technology, and venture capital firms continue to pour money into
the effort. More than $1.4 billion has been invested in blockchain technology over the past three years, according to an August report by the World Economic Forum
(WEF). More than 90 corporations have joined blockchain development consortia, and more than 2,500 patents have been filed. The WEF predicts that by 2017, 80% of banks
will initiate projects that involve distributed ledger technology. For enterprises interested in exploring how they can use blockchain and distributed ledgers, research firm Gartner recommends starting with limited-scope trials that
are aimed at specific problems. Enterprises can start to investigate how distributed networks might improve business processes that are constrained by transaction
inefficiency, and how technology suppliers might be able to help.

“The challenge for blockchain users and CIOs is to set appropriate expectations among business leaders,” Gartner writes in its 2017 strategic predictions report. “Plan
for a reasonable rollout, failure and recovery (especially through 2018); develop realistic proof of concept (POC) use cases; and be agile from an IT and business
perspective to follow the best path to success.”

By Ann Bednarz
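Kay’s description of the mechanism – each block cryptographically tied to everything before it, so tampering anywhere breaks the chain – can be sketched in a few lines. A toy hash-chained ledger in Python (illustrative only; real blockchains add distributed consensus, digital signatures, and Merkle trees):

```python
# Toy hash-chained ledger: each block's hash covers its transactions plus the
# previous block's hash, so altering any historical entry invalidates the rest.
import hashlib
import json

def block_hash(transactions, prev_hash):
    payload = json.dumps({"tx": transactions, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append_block(chain, transactions):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"tx": transactions, "prev": prev_hash,
                  "hash": block_hash(transactions, prev_hash)})

def verify(chain):
    """Recompute every hash; any tampering shows up as a mismatch."""
    prev_hash = "0" * 64
    for block in chain:
        if block["prev"] != prev_hash or block["hash"] != block_hash(block["tx"], block["prev"]):
            return False
        prev_hash = block["hash"]
    return True

chain = []
append_block(chain, ["Alice pays Bob 5"])
append_block(chain, ["Bob pays Carol 2"])
print(verify(chain))                         # True: the chain is consistent

chain[0]["tx"] = ["Alice pays Mallory 500"]  # tamper with history
print(verify(chain))                         # False: hashes no longer match
```

This is why no third-party guarantor is needed: any participant holding a copy of the ledger can recompute the hashes and detect tampering independently.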
Machine Learning – the promise of predicting the future

Historically, the challenge for organizations that want to use machine learning and cognitive computing technologies has been that they require hiring expert data
scientists who have spent their careers studying how to crunch data into artificial intelligence algorithms. In recent years, thanks to the proliferation of public cloud computing platforms, that’s changing. Companies like Amazon Web Services, Google, Microsoft and IBM have
all rolled out cloud-based machine learning platforms. “It’s really lowered the barrier quite a bit,” says Sam Charrington, an analyst and blogger who tracks the
machine learning market, adding that the technology is being democratized for everyday developers to use in their applications. At its most basic level, machine learning is the process of using data to make predictions of future behavior. Most commonly it’s been used in fraud protection
(training computers to detect anomalous behavior) and in teaching programs to predict future revenues and customer churn. IBM has trained its Watson platform to create
sophisticated chatbots for customer interaction and to help healthcare workers provide better care. It’s still early days for adoption though: A recent study by consultancy Deloitte reported that only 8% of enterprises use machine learening technology today. Allied
Market Research predicts the industry is growing at a 33% compound annual growth rate and will reach $13.7 billion by 2020. “The practice of employing algorithms to parse data, learn from it, and then make a determination … is gathering speed,” reports 451 Researcher Krishna Roy. Consumer
adoption of platforms like Amazon’s Echo and Apple’s Siri has seeded this market, but enterprise adoption has been held back by a lack of market education and
integration of these systems with existing enterprise platforms. But, she notes, one day this technology could become a “fundamental part of an enterprise’s
analytics fabric.” By Brandon Butler
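The core idea above – using historical data to predict future behavior such as customer churn – can be illustrated with a deliberately tiny model. A nearest-neighbour churn predictor in pure Python (the customers, features, and labels are all made up; production systems use far richer features and proper ML platforms):

```python
# Toy churn prediction: classify a customer by the majority label of the k
# nearest historical customers (features: monthly spend, support tickets).
import math
from collections import Counter

# (monthly_spend, support_tickets) -> churned?  Invented training data.
history = [
    ((120.0, 0), False), ((95.0, 1), False), ((110.0, 0), False),
    ((30.0, 5), True),   ((25.0, 7), True),  ((40.0, 4), True),
]

def predict_churn(customer, k=3):
    """Majority vote among the k historical customers closest to this one."""
    nearest = sorted(history, key=lambda row: math.dist(row[0], customer))
    labels = [label for _, label in nearest[:k]]
    return Counter(labels).most_common(1)[0][0]

print(predict_churn((100.0, 1)))  # False: resembles the loyal customers
print(predict_churn((28.0, 6)))   # True: resembles the churners
```

The cloud platforms mentioned above package up this train-on-history, predict-on-new-data loop at scale, which is exactly what lowers the barrier for everyday developers.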