
Question


We are increasingly seeing new trends in the application of emerging technologies, such as blockchain, audit analytics and continuous auditing, artificial intelligence, and others, in the public sector. Please take one of these trends and think about how it might affect public sector operations in the next five years and how governments can prepare for the adoption of these emerging technologies. This is a discussion thread that needs to be addressed in long paragraphs. See the readings below on machine learning, deep learning, artificial intelligence, and blockchain.


Machine Learning

Let computers learn and adapt. While just a few years ago this would have sounded like science fiction, machine learning is today a mainstream topic of discussion, not only in research universities but also in business and government. As a discipline, machine learning sits at the intersection of computer science and statistics. If data mining represents the broad category of activities for knowledge extraction performed on data, machine learning is the computer science approach to accomplishing the same goal. Knowingly or not, you have already come in contact with machine learning applications. Search engine web page classification, Facebook's "People You Might Know" feature, Airbnb search ranking, some video game player-matching scenarios, product or movie recommendations, and face and object detection are all areas where machine learning is being used intensively.

At a more conceptual level, the idea of machine learning is to develop algorithms that let digital computers identify or discriminate patterns (e.g., the pattern of pixels in an image, the pattern of terms in a tweet) without needing programmers to code an explicit procedure (i.e., rules), instead letting the machine learn the patterns from the data themselves. Machine learning is particularly useful for solving problems that are difficult to create rules for. Consider face recognition. While humans perform this activity intuitively, could you explain how you do it? Could you codify the set of "rules" your brain applies to every face to distinguish known from unknown ones? What about the procedure for putting a name to the face? When Facebook suggests a tag, the application is asking for confirmation that the name and profile of a person should be associated with the pattern of facial features it has detected. In that case you are training Facebook's machine learning algorithms to recognize that person. You can see why machine learning has become so popular of late. The trends we discussed throughout this book, such as the growing amount and variety of data being continuously produced by humans and machines alike, make it viable to train machine learning algorithms and useful (almost necessary!) to adopt them.

There are two general categories of machine learning algorithms: supervised and unsupervised. Supervised machine learning generates a predictive model from a known set of training examples known as a gold standard. In other words, the terms, or features, of the problem are known, and it is up to the machine to learn how to answer correctly given a new input (Figure 12.17). Supervised machine learning can perform extremely well in contexts where a reliable gold standard exists or can be developed, and a number of recent success stories in machine learning leverage established supervised algorithms. Consider the example of detecting fake and fraudulent online reviews. A team of scientists at Cornell University used supervised machine learning to achieve 89% accuracy when classifying TripAdvisor reviews. To do so, they commissioned 400 known fake reviews through Amazon Mechanical Turk and built a gold standard to train their classifier. Handwriting recognition provides a similar example, as does image recognition when algorithms are trained by humans who classified a small set of images (e.g., tagging your friends' faces on Facebook).
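To make the supervised workflow concrete, the following is a minimal sketch in Python using scikit-learn. The reviews, labels, and train/test split are toy placeholders, not the Cornell team's actual data or pipeline; the point is only the general pattern of building a labeled gold standard, learning from part of it, and checking accuracy on the held-out part.

# A minimal sketch of supervised learning on a labeled "gold standard,"
# in the spirit of the fake-review classifier described above.
# The reviews and labels are toy placeholders, not the Cornell data set.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

reviews = [
    "Best hotel ever!!! Amazing, perfect, a dream come true!",    # fake
    "Room was clean but the street outside was noisy at night.",  # genuine
    "Absolutely flawless experience, everyone must stay here!",   # fake
    "Check-in took twenty minutes; breakfast was decent.",        # genuine
]
labels = [1, 0, 1, 0]  # 1 = fake, 0 = genuine (the gold standard)

# Turn raw text into numeric features the algorithm can learn from.
X = TfidfVectorizer().fit_transform(reviews)

# Hold out part of the gold standard to measure how well the model generalizes.
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.5, random_state=0, stratify=labels
)

model = LogisticRegression()
model.fit(X_train, y_train)  # "training": learn patterns from the labeled examples
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))

In a real project the gold standard would contain hundreds or thousands of labeled examples, and the feature extraction and model choice would be tuned, but the train-then-predict structure stays the same.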
Unsupervised machine learning is helpful in situations where a gold standard does not exist or where the domain is constantly evolving and a gold standard would not remain relevant over time. A typical example is that of machines being able to successfully play games. The first high-profile example of its kind was the famous IBM Deep Blue, the computer program that in 1997 beat reigning world chess champion Garry Kasparov under tournament conditions. Unsupervised machine learning has since improved dramatically, and the consensus is now that an algorithm that is allowed to play a game with a clear set of rules and an objective function (i.e., a goal) will always beat unaided humans, no matter how complex the game. The defining moment for this recognition was AlphaGo's win against 18-time world Go champion Lee Sedol in March 2016. Go is an ancient Chinese strategy board game, similar to chess, long considered the ultimate frontier for machine learning due to the staggering number of possible board combinations that can occur during a match.

Deep Learning

Much of the current progress in machine learning is in an area called deep learning. Deep learning is an approach to machine learning that, using neural networks, mimics the way the brain works. Deep learning models seek to parameterize (i.e., learn) a discriminant hierarchical structure of features directly from input data. Over time, this should make learning algorithms better and more autonomous. More specifically, the aim is to reduce the effort in determining the features necessary for successful discrimination, today still a relatively labor-intensive process. Deep learning gained its early popularity in 2011 from the success of the Google Brain project, where a deep learning algorithm was able to recognize high-level concepts like faces, human bodies, and cats after being trained with only a random sample of 200 × 200 pixel images from YouTube.

The technical breakthrough behind deep learning, and machine learning in general, is the development of programmable graphics processing units (GPUs). GPUs are specialized microchips with an instruction set optimized for image processing. They excel at performing the same operation on many data elements very quickly. This enables them to carry out functions such as shading an area of the screen, moving a window or an icon from one part of the screen to another, or decoding compressed video frames very quickly. GPUs were originally introduced for computer graphics in video processing and gaming devices. Much of the efficiency of GPUs came from the fact that they were hard-coded to perform very specific instructions. But in the early 2000s, programmable GPUs became a reality with the introduction of shaders, specialized algorithms that run on the GPU. Coupled with the relentless effects of Moore's law (see Chapter 1) in increasing the number of transistors on microchips, the result is that a modern high-end GPU such as NVIDIA's GTX 1080 Ti can run more than 3,500 highly optimized programs (i.e., shaders) in parallel. While programmable GPUs were originally designed to speed up graphics processing, the fundamental computations required by machine learning have similar characteristics. It is the massive computational power made available by programmable GPUs, coupled with the unprecedented availability of training data, that underpins the recent success and widespread adoption of deep learning algorithms.
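As a small illustration of what a deep learning model looks like in code, here is a minimal sketch using Keras (the framework mentioned later in this reading). The input data are random placeholders rather than real images; the point is the stack of layers that learns a hierarchy of features, and the fact that the same code transparently uses a GPU if one is available.

# A minimal deep learning sketch in Keras. The "images" and labels below are
# random placeholders, used only to show the structure of stacked layers.
import numpy as np
from tensorflow import keras

x = np.random.rand(1000, 28, 28)        # 1,000 fake 28x28 grayscale inputs
y = np.random.randint(0, 2, size=1000)  # made-up binary labels

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),   # turn each image into a flat vector
    keras.layers.Dense(128, activation="relu"),   # first layer of learned features
    keras.layers.Dense(64, activation="relu"),    # deeper layer combining them
    keras.layers.Dense(1, activation="sigmoid"),  # output: probability of the positive class
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=3, batch_size=32, verbose=0)  # runs on a GPU automatically if present

Training on random labels obviously learns nothing useful; with a real labeled data set, the same few lines produce a working classifier.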
Deep learning algorithms are the building blocks of autonomous driving systems. In one recent high-profile case, two research teams showed that a set of deep learning algorithms could reliably beat some of the best poker players in the world. Libratus, from Carnegie Mellon University (United States), and DeepStack, from the University of Alberta (Canada), both proved very successful in tournament-style games of Texas Hold'em. The results are notable because, unlike board games such as chess and Go, poker is a game characterized by imperfect information, luck, and even misinformation (i.e., bluffing).

Despite all the recent success and the press hype, it is important for you as a manager to recognize the current limitations of machine learning. Machine learning algorithms perform best in scenarios where there are millions of reliably labeled data points (e.g., cat pictures versus images where no cats appear). Alternatively, as in deep learning algorithms that can play games, it is necessary that the "game" has clear rules and that it is possible to run millions of simulations or training experiments. Despite high-profile research examples, there are still limited real-world scenarios in business life where these conditions are met. Murray Campbell, one of the original creators of IBM Deep Blue, put it best: "While poker is a step more complex than perfect information games, it's still a long way to go to get to the messiness of the real world."

A Note about Artificial Intelligence

Attentive readers will have noted that we use exclusively the term machine learning in this section and, unlike much of the business press, we do not treat the term as synonymous with artificial intelligence (AI). AI is a computer science concept that is almost as old as the discipline. To be convinced, consider that the general test for artificial intelligence systems is the Turing test, named after the British mathematician Alan Turing, who proposed it in 1950. As normally interpreted, the test suggests that a machine can be said to exhibit thinking ability if it can fool a human, making him or her unable to tell the machine's answers apart from those of a real person. However, the term AI is confusing because it engenders visions of machines that become sentient and similar to humans. It should be fairly clear from the above discussion of machine learning and deep learning that computers are not (as of yet, at least) intelligent. Rather, under specific conditions, algorithms such as neural networks are able to parameterize thousands of mathematical functions based on available training data in order to engage in reliable classification and prediction. In other words, learning for a computer is about performing mathematical operations. Thus machine learning is very different from human learning, and framing it in terms of "artificial intelligence" is more confusing than helpful. Some experts even argue that the term AI should be banned. François Chollet, one of the foremost deep learning experts at Google and the author of the Keras framework, explained it best: "Human perception involves considerable amounts of abstraction and symbolic reasoning, unlike the input-output matching performed by 'machine perception' models."

In conclusion, while fears of the near-term development of artificial superintelligence or the "rise of the machines" may be overblown, there is no doubt that machine learning algorithms will continue to revolutionize various aspects of our lives. The most productive approach for you as a manager will be to think about machine learning as a foundational technology that will become increasingly embedded in information systems and applied to a wide array of problems.
Consider computer vision and face recognition, a problem that has largely been "solved" by deep learning. Benedict Evans, a partner with the venture capital firm Andreessen Horowitz, makes this point (Figure 12.18): "Eric Raymond proposed that a computer should never ask the user for any information that it can autodetect, copy, or deduce; computer vision changes what the computer has to ask. So it's not really a camera, taking photos; it's more like an eye, that can see." Tangible applications of what Evans means are already available. On the one hand, computer vision is automating and systematizing existing processes, like the scanning and detection of known persons of interest at airport checkpoints or other high-traffic venues (e.g., stadiums). As with any technology, however, computer vision will start by automating existing processes but very rapidly thereafter will begin to change the way we perform work and create opportunities for novel activities. In 2017, Google launched Google Clips (Figure 12.19). In an article aptly titled "The Google Clips Camera Puts AI behind the Lens," The Verge explained the promise of a standalone camera that uses machine learning to independently decide when to take a snapshot.

Blockchain

Bitcoin, a decentralized cryptocurrency system conceived by a mysterious person (or group of people) using the pseudonym "Satoshi Nakamoto," promises to revolutionize the way we think about money and the transfer of value in general. After a quiet launch in 2009, it has evolved into a multibillion-dollar industry, and it has inspired the creation of hundreds of similar cryptocurrency systems. While cryptocurrencies are receiving much attention in the business press, probably because of the rampant financial speculation around them (Figure 12.20), much more relevant for managers is an understanding of the underlying technology: the blockchain. The reason is that blockchain technology holds the potential to revolutionize record keeping, contract registration, and transaction management in a way that parallels the introduction of SQL and relational database management systems (see Chapter 3) in the 1970s.

The blockchain is engaged when a user wants to initiate a transaction, which in the case of Bitcoin is a financial transaction but could more generally be any transaction that needs to be recorded (e.g., the sale of an asset). The user digitally signs, with his or her private key, a message referencing a previous transaction, such as the previous sale of the asset or, in the case of Bitcoin, the unspent transaction outputs (UTXO) from a previous transaction. The user then indicates the recipient's public address and the amount of Bitcoin or the asset that will be transferred. The combination of public and private keys acts as a unique identifier of the users and removes the need for a central authority to assign accounts and identities. This transaction is then broadcast and propagated via the peer-to-peer blockchain network to all full nodes. To compensate for the absence of a central authority that ensures the accuracy and legitimacy of the transactions, all full nodes keep a complete copy of the global ledger so that they can independently verify that the asset belongs to the entity claiming ownership. The full nodes, in fact, have a complete history of all transactions, similar to an accounting log.
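The signing step just described can be illustrated with a short, simplified Python sketch using the cryptography library. It is not the actual Bitcoin transaction format (there is no UTXO handling, script, or serialization here, and the transaction fields are placeholders); it only shows the core idea that a message signed with a private key can be verified by anyone holding the matching public key.

# Simplified illustration of signing a transaction message with a private key.
# The transaction fields are placeholders, not real Bitcoin data structures.
import json
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# The sender's key pair; the public key doubles as the sender's identity.
private_key = ec.generate_private_key(ec.SECP256K1())
public_key = private_key.public_key()

# A toy transaction referencing a previous one and naming a recipient address.
transaction = json.dumps({
    "previous_tx": "placeholder-reference-to-prior-transaction",
    "recipient": "placeholder-public-address",
    "amount": 0.5,
}).encode()

signature = private_key.sign(transaction, ec.ECDSA(hashes.SHA256()))

# Any full node holding the public key can check the signature; verify() raises
# an exception if the message or signature has been tampered with.
public_key.verify(signature, transaction, ec.ECDSA(hashes.SHA256()))
print("signature verified")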
In order for a new transaction to be recorded on the blockchain, the miners, a second type of node, take the available unconfirmed transactions and group them together into a candidate block. The miners compete to validate the candidate block by engaging in a computational race that consists of solving a cryptographic challenge to find a special number called a nonce. The miner who manages to solve the puzzle first immediately submits the block and a proof-of-work (PoW) to the rest of the network, which accepts the block as valid if it meets all requirements. All miners then begin searching for a new block that will reference this newly recorded valid block. It is the continuous chain of all these cryptographically linked blocks (the blockchain, as it is called) that provides the instrument for record keeping. All full nodes on the network thus maintain a shared state of the global ledger that everybody agrees on by independently recomputing the whole history of transactions starting from the genesis block, the first block mined to launch the network. The system works because the miners, who provide costly resources (mostly electricity and computing power), do so in search of a monetary reward: a fee plus some amount of new currency (e.g., Bitcoin) that is released for each new valid block added to the blockchain. The core innovation of Bitcoin was not the blockchain itself but rather the so-called Nakamoto Consensus, an approach that keeps miners honest by making it more profitable for them to support the integrity of the system than to undermine it.
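The miners' computational race can be conveyed with a toy proof-of-work loop in Python. Real Bitcoin mining uses double SHA-256 over a binary block header against a far harder difficulty target, so treat this only as a sketch of the idea: keep trying nonce values until the block's hash meets the target.

# Toy proof-of-work: search for a nonce whose hash has a given number of
# leading zeros. The block contents are placeholders.
import hashlib

def mine(block_data: str, difficulty: int = 4):
    """Return the (nonce, hash) pair that satisfies the difficulty target."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest  # this pair is the proof-of-work
        nonce += 1

previous_hash = "0" * 64                          # genesis-style placeholder
candidate_block = f"{previous_hash}|tx1,tx2,tx3"  # prior hash plus pending transactions
nonce, block_hash = mine(candidate_block)
print(f"nonce found: {nonce}, block hash: {block_hash}")

Because each block's data include the previous block's hash, tampering with an old transaction would force an attacker to redo the proof-of-work for every subsequent block, which is what makes the recorded history so hard to rewrite.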
Our description of the blockchain is not designed to make you an expert on the technology, Bitcoin, or cryptocurrencies in general. There are a lot of freely available resources for those who want to wade deeper into the technical aspects of the blockchain. But even with this limited understanding, you can begin to appreciate the business appeal of this new technology. A blockchain is in fact a database with very desirable properties:

- Distributed ownership. Transaction records, collected in validated blocks, can be stored by any entity interested in doing so. Thus no individual entity represents a concentrated point of failure for the overall record-keeping system.
- Built-in validation. Because of the requirements for block validation, the blockchain ensures that no individual entity can tamper with the records. Old transactions are preserved forever, and new additions are irreversible.
- Transparency. Anyone who joins the blockchain can check the ledger and reconstruct the full history of transactions that have occurred since the system's inception.

Incentivized by the success of Bitcoin and by the promise of blockchain technology, hundreds of startups have entered the space. Most notable of all is Ethereum, whose website does not mention currency but rather reads, "Ethereum is a decentralized platform that runs smart contracts: applications that run exactly as programmed without any possibility of downtime, censorship, fraud or third party interference." As with machine learning, we believe that the power of the blockchain is in its promise to dramatically change the way institutions and organizations work.

Summary

In this chapter, we introduced some emerging and some enduring trends in information systems (IS) and technology management. Understanding these trends and technologies, the associated vocabulary, and the benefits and risks they engender for modern organizations is critical for you as a general or functional manager, as you will be called upon to participate in the debate about whether your firm should embark on initiatives that leverage the technologies and trends discussed in this chapter. Specifically, in this chapter we learned the following:

- The Internet of things (IoT), made of smart objects, is set to bring a whole new breed of DDSs. The IoT paradigm radically changes how value is created for customers, as well as competition among firms and its boundaries. In this sense, new business ecosystems will arise, compete, and eventually coexist.
- The widespread adoption of information technology and the increasing computer mediation of organizational and social processes have created the possibility to utilize data that are born digital. The digital data genesis (DDG) trend creates this opportunity. The sensors introduced in objects like the smartphone are an illustrative example of this trend. The continuous digital encoding and transmission of data related to the captured events generate digital data streams (DDSs).
- Virtual, augmented, and mixed realities are increasingly representative classes of technologies capable of immersing the user in digital environments. These technologies are bridging the separation between the artificial and the real world with the aim of augmenting the sensorial experience, thus enabling new interaction models.
- With digital manufacturing, it is possible to print objects directly from their digital representation and design. The flexibility of the process and 3D printers' increasing capabilities create the opportunity for a new breed of products and services, disrupting current product manufacturing practices.
- Advanced analytics shift the focus of data gathering and analysis to external sources, providing the opportunity for greater insight. The heterogeneity of the data, their massive volume, and their greater speed challenge established analysis practices, skills, and technologies.
- Machine learning, and deep learning in particular, lets algorithms identify occurrences and unknown patterns in data; the algorithms can then be trained to look for the same occurrences in new data sets. It is like having an analyst capable of looking for trends within amounts of data unmanageable for humans. This opens up a new category of applications that can derive reliable predictions from data. Instead of causality, machine learning derives empirical models that fit the available data.
- The renewed interest in AI, fueled by the success of machine learning-based applications, is reviving the debate on artificial cognition. While still far from sentient machines capable of fooling humans by passing the Turing test, machine learning technologies are becoming increasingly embedded in information systems and applied to a growing array of problems.
- The blockchain distributed ledger gave birth to a flourishing ecosystem of applications leveraging its main characteristics of distributed ownership, built-in validation, and transparency. Bitcoin is the most notable example of cryptocurrencies based on blockchain technology, in which the absence of a centralized authority is challenging the role of the banking system as the trustee for monetary exchanges.
