Recipe for Cost Reduction – Outsourcing, Offshoring, RPA, and more

Industry has seen many different forms of cost reduction strategy throughout history. You can travel back as far as industrialization, when big capital consolidated manual trades that used to be spread across mom-and-pop businesses.

The more recent examples in the last 20 years include outsourcing, offshoring and automation.

In this article, I would like to quickly walk through the recent history of such cost reduction and discuss the never-changing issues we all face.


Businesses used to be more vertically integrated. There was a good reason behind it: the transaction cost between businesses was high enough to justify integrating all those services within one company. From upstream to downstream, the business had full control over its services and quality.

There is also a good argument against such vertical integration. As Eliyahu Goldratt argued in his book The Goal, management started seeing the limits of how far they could optimize without considering the complications of the business supply chain.

Here comes outsourcing. As transaction costs between businesses dropped, many companies started disaggregating their business processes and outsourcing them to service providers.


For further cost reduction, outsourcing service providers and businesses started looking abroad: Eastern European countries like Poland for Europe, China and the Philippines for Asia, Mexico for America, and India for all of them.

Those countries offered a very cost-competitive workforce, and they attracted many cost-conscious businesses as their clients.

Offshoring served its purpose for the last 10 to 15 years, while costs remained competitive. That is no longer the case. Living standards in those countries have improved thanks to the foreign cash flowing in from offshoring/outsourcing work, and wage levels have risen accordingly. For the clients who offshored, the problems of cultural differences, training costs from high turnover, and loss of ownership become far more distressing once the cost appeal vanishes.


In the last five years or so, consulting firms have also started selling reshoring/backshoring. The end goal of reshoring/backshoring is regaining full control over the once-offshored business processes, their cost, and their service quality. Although such moves were happening, reshoring and backshoring never caught the marketing buzz as much as outsourcing and offshoring did – until automation, or RPA (Robotic Process Automation), arrived. Some businesses call it "digital".

Such automation uses a very old technique that IT teams used to call "screen automation". IT teams use it to automate screen testing and save testing effort. Consulting firms and solution providers have now rebranded it as "RPA", and voilà! Everyone is talking about RPA just like we did about outsourcing and offshoring.

The issues with automation, again

The upcoming issues with automation are obvious.

  1. Rising cost – Consulting firms pitch that, once implemented by consultants, the process can be customized and maintained by the business process owner (the client) himself. Really? He could not even manage the offshored tasks (human-to-human customization), so how can he maintain the computerized task (human-to-computer)?
  2. Loss of ownership – This already happened with outsourcing and offshoring. A manager used to manage his team and own the process. Outsourcing/offshoring moved the team to another business, yet the manager is still obliged to own the process without full visibility and control. Such a manager progressed his career as a subject matter expert, and he is now expected to serve more like a vendor manager.
  3. Loss of expertise – Hence, the next issue is losing expertise in the team. Those subject matter experts (SMEs) do not perform as well as professionally trained vendor managers (many consulting firm graduates have also progressed into vendor management). Naturally those SMEs shy away from such teams, and the company eventually finds itself full of managers who are good only at managing, with no expertise.

These issues are nothing new and not particular to automation. This is history repeating itself, every time.

What should we do instead?

There is only one thing to do: address the root cause.

If there is something inefficient in the process, before you even think about outsourcing, offshoring, or automation, think about what would happen if you simply omitted it.

Here is a very useful consultant's toolset for process improvement:

  1. Eliminate – Eliminate unnecessary tasks if removing them does not cause a major issue.
  2. Delegate – Delegate tasks that can be done, or had better be done, by someone else, i.e. outsourcing and offshoring.
  3. Consolidate – Consolidate the tasks to minimize effort overhead.
  4. Automate – Automate the tasks by leveraging tools and technology.

The numbers in the list above are not just for presentation; the options should be considered in that order. The later an option sits in the list, the more difficult it is to undo and change. So do you see where automation and RPA sit in the list? Everyone should think carefully before quick-patching their process with automation. There is no magic pill.

Jupyter Notebook: a deliverables-quality data analytics tool

Very often in consulting engagements, I encounter situations where I have to wrangle a huge dataset with a limited choice of tools. A typical client enterprise work environment, especially in a non-technology team, offers only the standard Microsoft Office suite plus possibly MS Access. This post is about my approach to building deliverables-ready documents with Jupyter Notebook, an open source tool for data analysis and visualization.

Here’s my story…

One of my client engagements was to estimate the impact of Brexit and Trump’s presidential election (and surprisingly, both came true against many opposite bets) on the trading volume of futures and options. My client was a mid-sized bank that runs its business with a small footprint but captures a big market share thanks to its very low cost of trading/clearing. Because of this attractive pricing, many market-making clients and cost-sensitive HFT clients signed up with my client. Given that client profile, when the market becomes volatile, the bank’s business volume triples or even quadruples, and such an acute rise in volume can hit hidden bottlenecks in the bank’s chain of business processes.

The client team tried to solve this with MS Excel and Access, but both have limitations. MS Excel, while its interface is intuitive and widely known, cannot manage large datasets: the client machine’s physical memory (e.g. 8 GB RAM), minus whatever RAM the OS and other applications reserve, is the ceiling on the data size a user can work with.

MS Access has a similar data size limit, and data cleanup has to be done through its query features. I often see MS Access files where a user has built 20–30 queries just to clean up data. Such layered queries are very hard to debug, as Access critically lacks the ability to inspect the updated data at each query step.

Now, I had three data files for each business day, each about 1–2 GB: one for cash equities, another for futures, and the other for options. Each business day amounted to around 5 GB in total. I had to run my analysis over the trailing six months, i.e. roughly 120 business days. The grand total of the dataset could be as big as 600 GB, and there was no way to analyze it in MS Excel or Access.

What Jupyter Notebook can do

Jupyter Notebook is an open source project that offers a web browser interface for analyzing data with Python and many other languages. The user writes code to manipulate data, and the output, if the user wishes, is displayed right below the executed code. Here is what Jupyter Notebook can do:

Useful libraries

Jupyter Notebook originated from IPython Notebook and, as that name implies, used to be a data analysis tool for the Python programming language (it now supports other languages too). Since it comes from the Python world, it can use many Python libraries, such as NumPy and pandas. Both offer data analytics toolsets, including data models, time series analysis, and a range of statistics tools.
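To give a feel for how this handles trade files far larger than memory, here is a minimal sketch of chunked aggregation with pandas. The column names and the inline sample data are hypothetical stand-ins for the real 1–2 GB daily files; the same pattern works on a file path with a much larger `chunksize`.

```python
import io
import pandas as pd

# Hypothetical miniature of one daily trade file (the real ones were 1-2 GB).
sample = io.StringIO(
    "trade_date,product,volume\n"
    "2016-06-24,Futures,120\n"
    "2016-06-24,Options,80\n"
    "2016-11-09,Futures,300\n"
    "2016-11-09,Options,150\n"
)

# Read the file in fixed-size chunks so memory use stays bounded no matter
# how big the file is, aggregate each chunk, and combine the partial sums.
totals = pd.Series(dtype="float64")
for chunk in pd.read_csv(sample, chunksize=2):
    partial = chunk.groupby("trade_date")["volume"].sum()
    totals = totals.add(partial, fill_value=0)

print(totals)
```

The key point is that nothing here ever holds more than one chunk in RAM, which is exactly what Excel and Access cannot do.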

Authentication Included

Jupyter Notebook now comes with basic password authentication, and login is required through the browser. This helps me a lot when pitching the tool. I used to have to prepare my own authentication setup on the earlier IPython Notebook, but it is now built into Jupyter Notebook. Thanks to this feature, I can even suggest hosting the Jupyter service on a public cloud with decent security, rather than on the client’s network, where I often run into limitations.
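Setting this up takes one command; the hashed password is stored in the Jupyter configuration, and the browser then prompts for it on login. The port and bind address below are just illustrative defaults:

```shell
# Set a login password for the notebook server; you are prompted twice,
# and the hash is written into the Jupyter config directory
jupyter notebook password

# Start the server; binding to all interfaces only makes sense together
# with the password (and ideally TLS) when hosting in the cloud
jupyter notebook --ip=0.0.0.0 --port=8888
```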

Hybrid of CUI & GUI

The output of Python commands is shown just below them, so a user can see the whole history of commands and outputs together. This especially helps users from a non-programming background who are allergic to the black terminal screen. They can copy/paste, scroll back and forth with the mouse, and so on.

Export to Python program

Programs built in Jupyter Notebook can also be saved as independent Python programs, so they can be rerun through the Command Prompt (Windows) or the Terminal app (Mac).

The developed program contains nothing specific to Jupyter Notebook. It is a portable, generic Python program, i.e. environment-agnostic.
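One way to do this export from the command line is nbconvert, which ships with Jupyter (the notebook filename here is hypothetical):

```shell
# Convert the notebook's code cells into a plain .py script
jupyter nbconvert --to script volume_analysis.ipynb

# The resulting script runs anywhere Python and its libraries are installed
python volume_analysis.py
```

The same export is also available from the notebook menu under File → Download as.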

Where to start, i.e. challenges for (especially) big enterprise

Start teaching programming to “non-tech” user

We need to start teaching users how to program so that they can code in Jupyter Notebook themselves. There are a lot of resources available online and offline, including many free open courses from universities.

I observe through consulting engagements that the larger the organization, the more the non-tech functions (e.g. Ops, Finance, Legal) try to stay away from programming. Jupyter Notebook is not something you can drive with drag-and-drop program blocks; users need to write code.

There are people who have programmed in VBA. If they can manage VBA, Python will be a lot simpler, in the sense that there is less to remember, such as all those MS Office application-specific object names.

Review what “EUC Banning Policy” is for

Many corporate environments ban EUC, or End-User Computing, out of concern that a service will be discontinued when the person who developed the tool leaves. They also have bad experiences with tools that did not work as expected. Corporate management often blames this on letting non-tech users, who are not trained to program, build the tools; thus they ban end-user programs such as MS Excel and Access tools with VBA and mandate that all such programs be developed by the IT team.

Here I want to ask: then why not train business users properly? Business users understand the business requirements best, not the IT team. They also have strong motivation to develop the tools, because the tools help their own work.

Moreover, if we let business users program through RPA (Robotic Process Automation), it will tie the organization to the RPA solution and keep it paying fees to consultants. The organization would be much better off letting teams learn a more generic and useful skill: programming.

If you want to know more of my experience

I have consulted with banks on how to implement Jupyter Notebook in their operations. Some banks still use it through the Jupyter Notebook interface. Others “graduated” to standalone Python programs, and Operations users run and maintain them through the command prompt to automate business processes such as fetching market data.

If you want to find out more about how it worked, please email me at