
Why Cryptocurrency Should be Killed


If you have been following the financial markets, I’m sure you have seen or read about cryptocurrency: digital currency whose first and best-known example, Bitcoin, became publicly available in 2009. The value of cryptocurrency is volatile and has swung up and down within very short time periods.

I’m not going to try to explain what it is, because I do not have the necessary understanding of how it works, other than that it requires a huge amount of computing power to solve mathematical problems. Even if I did, I would not be able to explain it in this post. For a full definition, please refer to the Wikipedia article on cryptocurrency.

My career has always been in banking, which means traditional ways of managing the value of money. Cryptocurrency sits on the other side of money, where there is no underlying value and everything is speculative. There continue to be many arguments that cryptocurrency is the way of the future. I have nothing against new technology and new methods of valuing certain assets, but I do not agree with cryptocurrency, and I believe it should be killed.

Before you jump down my throat to defend the existence of cryptocurrency, please consider my points below. After reading them, I hope you will agree with me that cryptocurrency is not an optimal solution.


No Underlying Value

Cryptocurrency such as Bitcoin does not hold any value; it is entirely speculative. The value exists only when traders buy or sell at an expected price. I understand that today’s money, such as the US dollar, is no longer pegged to gold, but the value of a single dollar can be measured against other currencies, allowing each currency to be exchanged for another. If you watch the value of Bitcoin over the last several years, you will notice that its “value” swings wildly. As an example, suppose a dealership is willing to accept one Bitcoin for a car that costs $20,000. The same coin could be worth $50,000 or $50 the next day.

Lack of Regulation

Regulators are slow in catching up with cryptocurrency. It is not because they do not understand it or know of its existence; it is because they are not able to regulate something whose value lives in the cloud on a database. Additionally, almost anyone can now create their own currency and have it publicly traded. We all know about Bitcoin. Now there is Ethereum, created by a teenager as a hobby, and Dogecoin, created as a joke. Cryptocurrency is also the preferred form of payment on the dark web, where online crimes take place.

The End of the World

Yes, you read the title right. Cryptocurrency could mean the end of the world. No, I’m not referring to how one could use Bitcoin to buy a nuclear bomb. I’m referring to how cryptocurrency could plunge the world into darkness because of the resources it uses to operate. For example, by some estimates Bitcoin consumes on the order of 130 TWh (terawatt-hours) of electricity per year, roughly as much as the entire Netherlands uses in a year. With a continuous stream of people jumping on the bandwagon to mine coins, its energy use will keep climbing until it overwhelms power grids. The natural resources around the world are finite, which makes the mining of Bitcoin unsustainable in the long run.
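The comparison above is just unit arithmetic, and a quick sketch makes it checkable. The power figure and the Netherlands’ consumption below are round, assumed numbers for illustration, not official statistics:

```python
# Rough, illustrative arithmetic only: both figures are assumptions.
avg_power_gw = 15                    # assumed average network draw, in gigawatts
hours_per_year = 24 * 365
annual_energy_twh = avg_power_gw * hours_per_year / 1000  # GWh -> TWh

netherlands_twh = 110                # assumed annual national consumption
print(f"~{annual_energy_twh:.0f} TWh/year, "
      f"{annual_energy_twh / netherlands_twh:.1f}x the Netherlands")
```

The point of the sketch is only that a continuous draw of a dozen-odd gigawatts, sustained all year, lands in the same range as a mid-sized country’s electricity use.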

A shortage of rare-earth minerals is another concern, as computer components are being gobbled up by miners. Mining cryptocurrency requires a huge amount of computing power to solve the mathematical equations, and to do that, miners use computer graphics cards to do the heavy lifting. There is a worldwide shortage of microchips and graphics cards today, which has resulted in an artificial increase in the prices of these components. The downstream effect is that many consumers have difficulty getting these items. It could mean life or death to some if the supply of this commodity dries up.

Another shocker I read about recently is a new cryptocurrency on the block. Chia (no, not the Chia Pet) is a newly traded cryptocurrency whose miners filled up over 3.6 EB (exabytes) of hard drive space in just a few weeks. For illustration purposes, personal computers 30 years ago used to come with a 40 MB (megabyte) hard drive, and it took three decades for personal computers to come with today’s 2 TB (terabyte) drives. Network servers usually contain petabytes of storage. Now imagine filling up an entire exabyte in about a week.
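To put those prefixes side by side, here is the same comparison as arithmetic (decimal units; the drive sizes are the ones quoted above):

```python
# Illustrative unit arithmetic: how many drives is 3.6 EB?
EB = 10**18
TB = 10**12
MB = 10**6

chia_bytes = 3.6 * EB
drives_2tb = chia_bytes / (2 * TB)    # today's 2 TB desktop drives
drives_40mb = chia_bytes / (40 * MB)  # the 40 MB drives of 30 years ago

print(f"{drives_2tb:,.0f} two-terabyte drives")
print(f"{drives_40mb:,.0f} forty-megabyte drives")
```

That is 1.8 million modern drives, or ninety billion of the drives PCs shipped with three decades ago.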

I understand how one could be interested in cryptocurrency because of the potential wealth it could generate. However, I truly believe that cryptocurrency could be the downfall of modern society. The harm it causes the environment alone should be reason enough for cryptocurrency to be outlawed.

Anyone who is into cryptocurrency should heed the old saying, “If it ain’t broke, don’t fix it.” Cryptocurrency is not a revolution; it is a fix that no one asked for. Albert Einstein reportedly regretted his role in enabling nuclear weapons because of the risk they posed to the world. The creator of cryptocurrency, Satoshi Nakamoto, is said to have disappeared after releasing his white paper. As of today, his identity remains unknown. Perhaps he felt the same as Einstein.

Is the U.S. Education System Prepared for the Future?


Recently I sat down with my teenage daughter to go over her math homework; she wanted me to check it. Sad to say, I was able to solve the basic algebra, but anything beyond that left me clueless. So my daughter asked me what I studied when I was in high school. I told her I learned the same math and have forgotten most of it because I never had to use it in real life. Then she asked me why she has to learn something she will never use. Yes, indeed, why?

I grew up in Malaysia and went through an education system that is very different from the US one. However, both have one thing in common: they tend to teach you subjects that you will never use again. Is this necessary, or just a waste of time?

I’ve been working professionally for over 20 years, and I can tell you my profession has never required math above a sixth-grade level. I never have to use science, because as a banker it is not required. I never have to recite history, because none of it is a prerequisite for employment. So why do we have to learn all these subjects in an age where the majority of them are not applicable in real life and we have everything at our fingertips? Why is the education system today so archaic that it fails to recognize that learning life skills would prepare students better than these core subjects alone?

Yes, I realize that many of these subjects prepare students to learn and think logically, but there should be a balance between core principles and life skills. My son is taking several Advanced Placement (AP) classes because they are recommended for when he applies to universities or colleges in two years. However, he has been putting in late nights and spending every waking hour on weekends completing homework and assignments. Needless to say, he hardly has time to get any exercise. Is this healthy?

After I graduated from high school, I applied to college in the US, was accepted to Bernard M. Baruch College, CUNY, and graduated with a bachelor’s degree four years later. As part of the requirements, I had to study English, algebra, civics, a foreign language, and even history. For my business management degree, I had to take statistics. Statistics was perhaps the most difficult subject to master; so difficult, in fact, that the exam was open book. After graduating from Baruch, I thought I had all the necessary skills to get a job. I was wrong!


When I interviewed for the several jobs I applied to, I found out that I was hardly qualified at all. The interviewers were only interested in whether I had graduated and whether I had any work experience. They never asked if I had aced math or language arts. They never asked if I was good at science or civics. By the time I received the fourth rejection, I realized that a college degree is not all that. I needed some real-life experience to land a job.

As I began to work full-time, I realized that I lacked many skills and that the higher education I completed had not taught them at all. The course I took on Excel, for example, was not advanced enough to help me do my job properly. I had to create a template using macros by learning from my coworkers. Working with data was an eye-opening experience too. I never knew data could be altered, sliced, and presented as trends to tell stories. None of that was in my college curriculum.

As my career took shape, I began to explore what I wanted to do with my life. One question that came up was whether I would be prepared when I retire. Then I realized that I had no knowledge of personal finance. Unless I worked in financial planning or investments, I would not know what to do with my money. I had to learn everything using Google and by reading Kiplinger’s Personal Finance magazine.

Needless to say, I was excited about personal finance and the potential of earning more so I could retire comfortably. I shared what I learned with my kids and started to teach them how to be financially responsible individuals. Then I realized that personal finance is not a subject taught at school at any level, from primary through high school. Universities and colleges do not even offer this course as a requirement.

Let’s switch subjects and move to technology. My son attends the local high school and commented that his computer science class is not teaching anything; he is learning programming a five-year-old could do, and the teacher was not even experienced in the subject. That begs the question: is this how we want to prepare our future generation? With the advances in Artificial Intelligence, robotics, and smart systems, the education system is not doing nearly enough to prepare them.

There are various reasons why the education system fails students. Resources could be one, but they are usually out of our control because school funding is entirely controlled by federal, state, and local governments.

If I had to point to one major reason for the failure, it would be the lack of experienced teachers. The majority of teachers are career educators who have been teaching children all their lives. They follow the program and teach what they were told to teach. Most of them have no real-life experience outside the classroom, so they cannot relate what they teach to the working world.

Here is one good argument about language arts, or in other words a good command of the English language. I took English 101 during my freshman year, where we had to follow a prescribed structure for writing essays. This structure is passed on from elementary school all the way to higher education. Here is the shocker: not all of the concepts we learn are applicable to the business world! The executives of multinational companies do not have time to read an entire page of “essay.” In business writing, we have to boil down the activities of an entire quarter or year to just a few sentences for the executives to read.

The education system today needs a full overhaul to prepare our children for the future. Is there a silver bullet that could solve the problem? Yes, but it will require collaboration between educators and parents.

Let me know your thoughts. How is the education system in your country? And do you agree with my assessments?

How to Manage Data – A Quick Guide

Data has become so embedded in our lives that many of us do not realize how much it controls them. Data can work for you, and it can also work against you if it is not well understood or properly handled. Used properly, data is an important tool.

I have worked with all kinds of data throughout my professional life. I started by assigning auditors to audit engagements, then worked on financial and expense data. For the past 15 years I have focused on control functions within banks and on audit-issue remediation. The data I worked on always ended up on someone’s desk, and I realized that my work could be used by senior management to make important business decisions. One of the most important lessons I learned is that for important decisions to be made, the data must be complete and accurate.

I recall that my first exposure to data was a college class on DB2, a database application designed to work with “flat” data. I was amazed that I could filter and sort the data; the rest was just a haze. Microsoft Access, part of the Microsoft Office suite, was revolutionary. I was exposed to the application in my second year at Arthur Andersen, realized its potential, and found many uses for it in the years that followed.

Fast forward 15 years, and my work with data is more important than ever. My current role requires not only data-analysis skills; I also have to connect different sets of data and tell a story, developing trends that senior management can easily understand. One of the biggest challenges in telling that story is incomplete data. In my line of work, decisions are made that could impact the livelihoods of other employees. Therefore, the data must exist and be factually accurate.

There are several challenges that every business must face to ensure its data is reliable. There are many solutions, but this article focuses on what-if scenarios and the best way forward to ensure the data is reliable and accurate. I will also touch on the role of Artificial Intelligence in using data.

Data Availability

With the rise of smartphones (do we even call them that anymore?), the availability of data has multiplied tenfold. Computing power has also multiplied over the past decade, to the point that anyone with a regular desktop can churn out enough information to operate a robot. However, in certain industries such as banking, data availability continues to be siloed. At HSBC, the bank I work for, we need to produce management reporting based on information that is generated internally. We can’t use data from outside the bank for various reasons, but mainly for relevancy. When data is not available, we need to create it internally.

In the AI field, the issue is different. Let’s use Amazon, the largest retailer on the planet, as an example. Amazon’s goal is to sell many more products to its customers. Before the big-data revolution, it needed to understand its customers and make suggestions that would entice them to buy more. To achieve that, it would use data available in house, combined with external data, to build a suggestive AI engine model. The AI engine uses both sets of data to build a list of products and display them on the buyer’s homepage. Nowadays Amazon is so vast that it can rely on its own data alone.

Data Relevancy and Linkages

Most companies today generate tons of data daily. However, there is an important task of analyzing how relevant that data is to day-to-day operations, and if it is relevant, how the company creates relationships or linkages within it. Using my current work as an example: every employee in the bank must complete mandatory training annually, and employees are also encouraged to take training relevant to their work. How do we link internal employee-training data to retail-client data? Is it even relevant?

A real-life example is how COVID-19 has impacted the way we manage the business. The pandemic caused the Internal Audit function to revise the annual audit plan. Any changes to the plan must be approved by the oversight board, the Audit Committee, and communicated to the local regulators. Unfortunately, we can’t just “write” it off and hope for the best. My job is to understand the changes, review past history and the current business environment, and determine whether any of the risks can be mitigated. I would go through various data, review all handwritten notes, and produce a summary so the Chief Audit Executive can make a sound judgement. The task was manual, because some of the data was incomplete and human judgement was necessary.

Data Organization

Having tons of data available is useless if the data is not organized properly. This is why we have data scientists today; their role is to analyze how the data is used and how to organize it so it is readily available. If I had to guess, 90% of people who run reports from a preset system do not understand this concept. They export the data to Microsoft Excel and crunch the numbers without thinking twice about where the data comes from.

Data must be organized in a multi-dimensional way so it can be used in multiple ways. The data cannot be flat, like an Excel dataset. Instead, data from multiple tables is linked using key fields, which allows the data to be molded, adjusted, and sliced easily. This is where Microsoft Access and other relational databases shine. Most data is saved in SQL databases, where it can be easily retrieved and analyzed. Some companies have created front-end reporting systems that do all the hard work and require limited programming knowledge; Qlik Sense and Crystal Reports are two that I am aware of.
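The key-field idea can be shown in a few lines with Python’s built-in SQLite module. The table and column names here are made up for illustration; any relational database (Access, SQL Server, etc.) works the same way:

```python
import sqlite3

# A minimal sketch of the relational idea: two tables linked by a key field.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE departments (dept_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE expenses (id INTEGER PRIMARY KEY, dept_id INTEGER, amount REAL);
    INSERT INTO departments VALUES (1, 'Audit'), (2, 'Finance');
    INSERT INTO expenses VALUES (1, 1, 500.0), (2, 1, 250.0), (3, 2, 900.0);
""")

# Joining through the key field lets us slice the same data any way we like,
# without storing a flat, duplicated spreadsheet.
rows = con.execute("""
    SELECT d.name, SUM(e.amount)
    FROM expenses e JOIN departments d ON d.dept_id = e.dept_id
    GROUP BY d.name ORDER BY d.name
""").fetchall()
print(rows)  # [('Audit', 750.0), ('Finance', 900.0)]
```

The same two tables could just as easily be grouped by month, by amount band, or joined to a third table, which is exactly the flexibility a flat export loses.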

Data Lifespan and Knowledge Transfer

Data must be kept for an indefinite lifespan. Regardless of how old it is, data is a valuable asset: it can produce trend analyses, provide deep insights, and predict future occurrences. One good example is predicting weather patterns in the US. However, what is the value of data if it is housed in a secure location that no one has access to? Proprietary data must be housed in a location that is properly secured and documented, and it must be easily transferable should the need arise. Documentation must include the location of the data, its intended use, a description, and its classification. Without proper documentation, knowledge of the data cannot be transferred.
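The documentation requirements above can be sketched as a simple catalog record per dataset. The field names, path, and dataset below are hypothetical, not a standard; the point is only that each attribute named in the paragraph gets captured somewhere transferable:

```python
# Illustrative only: one catalog entry per dataset, covering location,
# intended use, description and classification.
dataset_catalog = [
    {
        "name": "audit_issues_2021",
        "location": r"\\fileserver\audit\issues\2021",   # where it is housed
        "intended_use": "Audit issue remediation tracking",
        "description": "Open and closed audit issues with due dates",
        "classification": "Internal - Confidential",
    },
]

def describe(catalog):
    """Summarize the catalog so knowledge can be handed over."""
    return [f"{d['name']} ({d['classification']}): {d['intended_use']}"
            for d in catalog]

print(describe(dataset_catalog))
```

Whether this lives in a spreadsheet, a wiki, or a data-catalog tool matters less than that it exists and travels with the data.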

Information Security

For data to be relevant, all data, whether proprietary or used to make important business decisions, must be protected at all costs. This has become so important that it is one of the subjects regulators around the world are tackling. Data is such a valuable asset that it is traded and available for sale on the open market; every day we hear about information being stolen by bad actors and sold on the dark web. Hence, protecting data is paramount for any company, and cybersecurity has become a field that many companies are investing in heavily. How can you contribute to cybersecurity? Be aware of the policies and procedures set forth by your company, and follow common sense.

This blog is not intended to cover everything about managing data. Data is a subject that could fill multiple books, and most universities spend months just skimming it. However, I wanted to touch on several subjects that are essential to keeping data relevant in the business world, and I hope I’ve done that. Thank you for reading.

Importance of Data in Advancement of Artificial Intelligence (AI)

There is a misconception that Artificial Intelligence (AI) started in the past decade. That is true in a sense, but real AI started well before then. According to Wikipedia, the field of AI began in the 1950s; in fact, its roots trace back centuries to automatons. AI has become one of the most important fields today because of two developments: the availability of computing power and the availability of data.

I am not an expert in this field, but having studied the subject over the past year, I was surprised to find that my career background is related to the development of AI. I started my career at Arthur Andersen and was exposed to “Big Data” when I was responsible for financial reporting there. Before Arthur Andersen disappeared from the business world, it had one of the best data systems, the Financial System of the 90 (FS90). I had the privilege of working with Ralph Schonenbach, who is now the CEO of Envoy, in developing several tools for Arthur Andersen’s Financial Control function.

I found the data owned by Arthur Andersen fascinating. With a complete data map, I was able to generate various reports using Microsoft Access, and some of the tools I created went on to become integral parts of management reporting. The experience I gained at Arthur Andersen helped me tremendously as I moved to Citigroup and HSBC, where I continued to create management reporting for management and the US banking regulators.

AI today is no different from what I experienced at Arthur Andersen. Essentially, AI uses huge amounts of data to create trends, outlooks, and suggestions; from these, it can harness the data to automate repetitive tasks. AI has become more important as Big Data has become readily available. The explosion of smartphones has also helped fuel AI, as more and more companies have found ways to collect data from smartphone users through apps. For example, apps such as Spotify, Netflix, and the ubiquitous Google Chrome collect terabytes of data every single day. Many companies, particularly Google, saw the potential of all this data and began to monetize it.

Computers today have also advanced exponentially, allowing AI developers to crunch data more quickly and efficiently. I remember that the first PC I bought in the early 1990s used an Intel 386 chip running at 60 MHz. Today’s PCs run at teraflops; a teraflop is a unit of computing speed equal to one million million (10^12) floating-point operations per second. Of course, not everyone needs that kind of computing power for everyday use. That’s where computer hobbyists come into the picture, with developments in microcomputers such as the Raspberry Pi and Arduino. Nowadays many companies in the AI business are talking about the Internet of Things (IoT). In case you are not aware, IoT refers to the interconnection, via the Internet, of computing devices embedded in everyday objects, enabling them to send and receive data. We are talking about everything from toasters to doorbells.

Why is data so essential to the development of AI? I’m not a data scientist (a field that is itself new, a result of the explosion of AI), but I can tell you that without data, AIs are just dumb machines. Data enables AI developers to stitch different data sets together and generate a trend; from the trend, developers can form hypotheses and create predictive analyses.

Let me explain. Before I was exposed to Microsoft Access, I used Microsoft Excel for a lot of my computing work; all financial analysis requires Excel. However, Excel data is flat, meaning the numbers you put into a formula generate one known result. Microsoft Access is different because it is a relational database: essentially, the database contains multiple flat tables interconnected through relationships using key fields. From those relationships, Microsoft Access allows the user to produce different results based on selected criteria.
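The flat-versus-relational contrast can be sketched in plain Python. The client names and balances below are made up for illustration:

```python
# Flat: each row repeats everything, and a formula gives one fixed answer.
flat_rows = [
    {"client": "Acme", "region": "East", "balance": 100.0},
    {"client": "Acme", "region": "East", "balance": 250.0},
]
print(sum(r["balance"] for r in flat_rows))  # 350.0 -- one known result

# Relational: two "tables" joined through a key field (the client id),
# so the same data can answer different questions depending on the criteria.
clients = {1: {"name": "Acme", "region": "East"},
           2: {"name": "Bolt", "region": "West"}}
balances = [(1, 100.0), (1, 250.0), (2, 75.0)]

def total_by(region):
    return sum(amt for cid, amt in balances
               if clients[cid]["region"] == region)

print(total_by("East"), total_by("West"))  # 350.0 75.0
```

The relational version never duplicates the region on every row; change it once in `clients` and every query picks up the change.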

AI today uses the same concept, but at a much bigger scale. The data sets may not even be related to each other, but the AI understands what the user is looking at and produces results that could be related. Let’s use Netflix as an example. When you first sign on to Netflix, the service asks what genres of movies you like to watch. As you use Netflix more, the AI builds up your profile and begins to suggest movies you would prefer. For example, I have always been a WWII aficionado. When I first signed up for my Netflix account, I never told it that I wanted war movies. Over time it started suggesting war movies, documentaries, and even war-related sci-fi.

The above example is on the software side, but what about robotics and hardware? Where does data come into play? When I attended the AI Summit, I had the privilege of attending Lockheed Martin’s presentation on AI. I found the Automatic Ground Collision Avoidance System (GCAS) fascinating, especially how it saves lives. Fighter pilots go through maneuvers that can produce g-forces strong enough to render a pilot unconscious or cause spatial disorientation. GCAS kicks in, automatically levels the flight, and prevents the jet from crashing into the terrain. It requires multiple data feeds, such as wind speed, aircraft speed, aircraft location, pilot responsiveness, and historical data, to determine when it is appropriate to take control of the aircraft.

Anyway, this is just a blog, not a scientific paper arguing how data became so important in the AI field, and I am not qualified to provide an expert view. But after being a champion of data quality and a user of data for over 15 years, I can tell you that data is everything. Our lives are driven by data, and they will continue to be. I won’t be surprised if we start to embed AI in our consciousness in the next decade or so. There are many opponents of this idea, as it crosses the line on privacy; that is a subject for another day.

Is Artificial Intelligence (AI) the next big thing?

Earlier this week I attended New York’s Artificial Intelligence (AI) Summit, offered for free through my company. The event is meant for businesses interested in applying AI across every facet of the company, and it was attended by over 5,000 “delegates” and various speakers. While I feel the event was informative overall, I did not find it that useful, simply because AI is not an easy subject to tackle.

That brings me to the title of this blog: is AI the next big thing? Over the past century there have been plenty of “next big things”: personal mobility, flight, and the coming of the internet, to name a few. After being exposed to the subject, becoming a developer of “AI” in a small way, and attending many discussions about it, I find that AI is an unavoidable subject that everyone is already living with.

What is AI?

AI is a broad subject that covers a wide range of automation. Assuming that AI simply means robotics is incorrect: AI refers to automating the tasks we do every day. At a deeper level, AI is about making those tasks not only easier but better. However, there are a number of risks involved; in this blog I’m not going that deep into the subject, because I’m not an expert in this area.

What is AI, really?

At its core, AI is just an “if-then” statement; in other words, it translates to cause and effect. In computer lingo, you program the machine to identify a condition or action, and if the condition is satisfied, something happens next. Let’s use Alexa as an example. You can program Alexa, through a routine, to tell you the weather: “if” the temperature outside drops below 30 degrees Fahrenheit, you want Alexa to remind you to wear a heavy jacket before going out into the cold. AI can be grouped into multiple categories, from front-end uses (applications, etc.) to Machine Learning (ML) and deep neural networks. For the sake of not confusing anyone, I will use “AI” to cover all of these.
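The if-then idea fits in a few lines. This is a toy routine, not Alexa’s actual API; the threshold and messages are made up for illustration:

```python
# Cause and effect as code: one condition, one action.
def morning_reminder(outside_temp_f):
    """Return a reminder based on a simple if-then rule."""
    if outside_temp_f < 30:            # cause: below 30 degrees Fahrenheit
        return "Wear a heavy jacket!"  # effect: the reminder fires
    return "No jacket needed."

print(morning_reminder(25))  # Wear a heavy jacket!
print(morning_reminder(60))  # No jacket needed.
```

Real systems stack thousands of such rules, or learn them from data, but the cause-and-effect shape is the same.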

My exposure to AI

Every one of us uses AI, whether we are aware of it or not. For example, if you are reading this, I can assume you already have a Netflix account. A normal user would not know that various AIs run every time he or she opens the Netflix app. Netflix uses your past behavior to make movie suggestions on your home screen, and it even uses AI to choose the thumbnails shown there. The biggest question is: how does Netflix manage to do that in a split second? I, like everyone else, am mainly a user of AI. However, only in the last year or so did I realize that I am also a developer of AI, albeit on a small scale. I’ve been developing databases using Microsoft Access for over 15 years. During this time I created over 50 different databases (or tools) to do things more efficiently, and these tools were able to generate hundreds of different reports through the conditions I built in.
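The profile-building described above can be caricatured in a few lines: count the genres in a watch history and suggest unseen titles from the favorite genre. The catalog and history are invented, and real recommenders are vastly more sophisticated, but the shape is the same:

```python
from collections import Counter

# Toy sketch of preference-building (all data invented for illustration).
catalog = {
    "Dunkirk": "war", "Band of Brothers": "war",
    "The Office": "comedy", "Midway": "war",
}
history = ["Dunkirk", "Midway", "The Office"]

# Most-watched genre becomes the "profile"; suggest unseen titles from it.
favorite_genre = Counter(catalog[t] for t in history).most_common(1)[0][0]
suggestions = [t for t, g in catalog.items()
               if g == favorite_genre and t not in history]
print(favorite_genre, suggestions)  # war ['Band of Brothers']
```

Watch two war films and a comedy, and the next war title you have not seen floats to the top, which is roughly the experience I described with my own account.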

Is AI easy?

Even though I’m an AI developer by that definition, I would be lying if I said AI is easy. This is one of my criticisms of the AI Summit I attended: all the speakers, delegates, and booths at the conference seemed to suggest that AI is both a must-have and easy. After attending an Amazon Web Services (AWS) bootcamp two months ago, I realized that AI is not as easy as advertised. Not only do you need to understand computer lingo, you need a good grasp of programming. There is no “click and drag” in AI.

Should everyone work with AI?

The answer is yes. In one of the talks at the conference, the speaker said that over the next several decades AI will be so important that it will decide whether a business succeeds or fails. Businesses that start incorporating AI into their business models will likely succeed, and those that do not think about AI today will fail (see above). The same goes for the workforce: if you want to succeed, you need to start thinking about how to incorporate AI into your career. For years I was satisfied staying in my “comfortable” spot and not worrying about my future. That is no longer the case, as I see friends progressing while I feel “stagnant.” That’s why I’ve begun to explore this area more and more.

What do you need to do now?

If you are currently in the workforce, start investigating AI and how it can help you, or vice versa. Start incorporating AI into your daily work; one way is to look at your internal tools and processes and see whether there is any way to make things more effective. If you are a parent with children still in grade school, start encouraging them to learn computer coding (particularly Python).

In the next several blogs, I will invest more time in discussing AI. If you have any questions or comments, feel free to respond to this blog.