#WomenInTech – Sigrid Rouam

From genomic sequencing to finance and FMCG, Sigrid has been working in data science for many years. She shares invaluable insights on building the right team, dealing with the fast pace of change in technology, and the constant need to educate organizations about data science. Buckle up!

Hi Sigrid, what do you do and what brought you to Singapore?

I am Head of Data Science at the Singapore Exchange (SGX). I came to Singapore 11 years ago to complete my PhD in Statistics, jointly conducted at A*STAR (GIS) and Université Paris Sud. After that, I did a postdoc at A*STAR in cancer genomics before moving to the private sector, gaining experience across several industries, namely FMCG, telco and finance. Initially the plan was to stay in Singapore for 6 months, but 11 years later, I am still here!

You started your career in biological sciences, worked in a few different industries and are now in finance. What did you learn from working in these different industries?

Working in different industries has given me a broader knowledge of how to use data science to solve problems in different contexts.

Genomic sequencing was my first exposure to big data. Processing large amounts of data is not necessarily what statisticians care about (you usually leave this to engineers or computer scientists), but being able to deal with increasingly large datasets is inevitable in today’s world, where the amount of data created daily grows exponentially. Genomic data also give rise to the multiple-comparison problem, which, in statistics, occurs when a large number of hypotheses are tested simultaneously. In the case of genomics, the expression levels of thousands of genes are measured at the same time. The more inferences are made, the more likely erroneous inferences become. The multiple-comparison issue needs to be tackled in order to obtain meaningful and reproducible results.
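To make the problem concrete, here is a minimal sketch with simulated p-values (not real genomic data): naive thresholding at 0.05 flags hundreds of null genes purely by chance, while a standard correction such as Benjamini-Hochberg keeps the false discovery rate in check.

```python
# Simulated p-values, not real genomic measurements.
import numpy as np
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(42)
n_genes = 10_000

# 9,800 genes with no real effect (uniform p-values), 200 true signals.
p_null = rng.uniform(size=n_genes - 200)
p_signal = rng.beta(0.5, 20.0, size=200)
p_values = np.concatenate([p_null, p_signal])

naive_hits = (p_values < 0.05).sum()  # expect ~490 false positives alone

# Benjamini-Hochberg controls the *false discovery rate* at 5% instead.
reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")

print(f"naive hits at p < 0.05 : {naive_hits}")
print(f"hits after BH-FDR      : {reject.sum()}")
```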

Working in the largest FMCG company in the world taught me how to develop a customer-oriented approach. Everything I did, I did it with the customer in mind. The CEO used to always remind us that “the customer is the boss”.

In Telco, I dealt with geolocation data and how to productise and monetise it. I also learnt how to deal with personal data and data privacy.

When I joined the financial industry, I discovered a whole new world. Financial data – trading data to be more specific – is very time sensitive. Everything happens so fast that speed and latency are critical, and financial data analysis requires specific infrastructure and tools. Dealing with time series also brings additional complexity, because what happened yesterday can’t be treated independently from what happens today. This is totally different from clinical research, where each subject can be treated independently from the rest.
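To illustrate the dependence she describes, here is a small toy example (invented data): in a random walk, each value is yesterday’s value plus noise, so observations are strongly autocorrelated, unlike independent draws.

```python
import numpy as np

rng = np.random.default_rng(1)
iid = rng.normal(size=1000)              # independent observations
walk = np.cumsum(rng.normal(size=1000))  # today depends on yesterday

def lag1_autocorr(x):
    return np.corrcoef(x[:-1], x[1:])[0, 1]

print(f"independent data, lag-1 autocorrelation: {lag1_autocorr(iid):+.2f}")   # ~0
print(f"random walk,      lag-1 autocorrelation: {lag1_autocorr(walk):+.2f}")  # ~+1
```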

Even though some data science skills are transferable across domains, context plays a crucial role too. Clearly understanding the problem requires data scientists to understand the industry in order to gather relevant data and draw relevant conclusions. Data scientists can have very strong technical skills, but if they can’t interpret models correctly and tell a story about the data in its context, all their efforts will be in vain.

When you joined SGX, you had to build everything from scratch. What did you enjoy the most?

First, building my own data science team from scratch, and second, having strong support from upper management.

Building a team means addressing four key areas from my perspective:

People

We have adopted a hybrid approach, with a core data science team that reports to me and what we call super users from different business units. The core team is built with diversity in mind: there is a mix of people with expertise in statistics, computer science, engineering and finance. Because the team is quite small and the amount of work is large, we have trained the super users in coding. They help us with simpler tasks such as data retrieval, dashboards and reports tailored to their BU’s needs, and they also bring requirements from their teams back to us. This gives us more bandwidth for advanced analytics.

Processes

Sound scientific principles require a set of best practices to guide data scientists in their daily activities. I created the SOPs and pushed to adopt agile practices. Finally, the team has defined a set of analytics deployment processes: the steps to promote code from a development environment, to a testing environment (on real data), and finally to a production environment (deployment to end users).

Technology

I spent the first year building a machine learning platform that enables data scientists to run analytics at scale. The platform allows us to process large amounts of data using parallel processing, and it supports the most common coding languages used by data scientists, R and Python. It also satisfies all the security and compliance requirements.
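The article doesn’t describe the platform’s internals, so this is only a generic sketch of the kind of embarrassingly parallel scoring job such a platform makes easy, using Python’s standard library; the model and data are placeholders.

```python
from concurrent.futures import ProcessPoolExecutor

def score_partition(rows):
    """Placeholder for scoring one partition of the data with a model."""
    return [r % 2 for r in rows]  # stand-in for real model predictions

def parallel_score(data, n_workers=8, chunk=10_000):
    # Split the dataset into partitions and score them in parallel.
    partitions = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(score_partition, partitions)
    return [score for part in results for score in part]

if __name__ == "__main__":
    scores = parallel_score(list(range(100_000)))
    print(len(scores))  # 100000 rows scored across worker processes
```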

Use-Cases

A big part of my role is to define use cases. In the beginning, the team built easy use cases to prove the value of data science and fund the analytics journey. Now that the team is bigger and our processes are more mature, we can spend more time on both exploratory work (which aims at bringing innovative technologies to the exchange) and delivery work (which involves working very closely with the different business units to cater to their respective needs, e.g. better customer understanding, task automation).

Our company organization gives my team the unique opportunity to interact with the CEO and President. Their vision for the future is truly inspiring and acts as a real enabler for our team’s development.

The success of data initiatives in a company relies largely on the CEO’s support, which is critical to increasing the adoption of a data-oriented mindset across the company. Creating a data team from scratch often implies changing processes, the organization chart and the culture. If the CEO doesn’t buy into the data transformation journey, the data science team ends up spending more time convincing the organization than doing its job. Data science is not all about technology; it is also about communication and change management.

After several experiences now, what are the main challenges data leaders face in their role?

There are two main challenges:

  • Convincing people of the importance of data science for the company and dispelling misconceptions around it
  • Coping with a constantly changing environment where technology evolves very quickly.

On the first challenge, I adopted two key strategies.

First, we have to educate people. I organized sharing sessions on specific topics (e.g. what is AI/ML), demos of specific use cases, as well as training on statistics, Python and q (the coding language we use in the company to extract data from our database).

In addition, I put a lot of effort into building a data-driven culture, i.e. leveraging data whenever and wherever possible to help business units make better decisions. One initiative I started recently was building a data science community to promote the exchange of data and ideas, encourage cross-disciplinary work, and brainstorm new ways of looking into data.

To cope with a constantly changing technology landscape, I practice continuous learning. I love being challenged and acquiring new skills, be it technical skills, soft skills or business understanding. Learning is like playing a sport: the more you learn, the better you are at it, and the more addicted you become to it too! When I moved to SGX, I had to pick up financial knowledge as well as the q programming language. Three months later, I was training people on q. The important thing is to stay humble and not be afraid to start from scratch again and again.

As more and more data is used to train machine learning algorithms, AI has made tremendous progress over the last few years. How can we ensure that AI is used for good?

In order to ensure that AI is used for good, and to prove Elon Musk wrong when he says that AI is evil, we need to start thinking about how we can build AI in a sustainable manner.

In order to build a sustainable AI, i.e. an AI that is here to last, there are four aspects to keep in mind: AI should be inclusive and fair, ethical, responsible and explainable.

To be inclusive and fair, AI should benefit everyone, and no one should be left behind. AI should not discriminate, and we should prevent biases. Private companies have made some progress: for example, Google Images no longer tags black people as chimpanzees.

An ethical AI means finding a balance between privacy and the common good. Where should we stand between a world where nobody shares any data and one where we know everything about everyone? If we don’t share any data, it becomes hard for models to be fair, as they are not trained on a representative set. At the same time, if we reveal too much, our freedom may be impacted and the data we share may be used against us.

In order to build a responsible AI, our biases need to be properly managed. There are two types of biases to manage: those we are conscious of and, more importantly, those we aren’t. Having a third party look at the data or models built by data scientists would be of great help: what I call a “data psychologist” who, much like a psychologist with a patient, would ask various questions to ensure that no biases are introduced.

Finally, AI should be explainable and transparent. For example, when a bank builds a credit scoring model, it should be able to explain why it is approving or rejecting applicants. More importantly, the decisions to accept or reject applications should be consistent over time. As more and more data is used to train the machine learning algorithms, there is a risk that the final decision may change. If models and decision-making processes can be explained, these problems are more likely to be avoided.
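As a hedged illustration of explainability in credit scoring (invented data, not any bank’s real model), an inherently interpretable model such as logistic regression lets you read off each feature’s contribution to an approve/reject decision.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features and applicants, for illustration only.
features = ["income_k", "debt_ratio", "missed_payments"]
X = np.array([[60, 0.2, 0], [25, 0.7, 3], [80, 0.4, 1],
              [30, 0.9, 4], [95, 0.1, 0], [20, 0.8, 2]], dtype=float)
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = approved, 0 = rejected

model = LogisticRegression().fit(X, y)

applicant = np.array([[40, 0.6, 2]], dtype=float)
decision = model.predict(applicant)[0]

# Contribution of each feature to the log-odds: coefficient * value.
for name, coef, value in zip(features, model.coef_[0], applicant[0]):
    print(f"{name:>16}: {coef * value:+.2f} log-odds")
print("decision:", "approve" if decision else "reject")
```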

In order to ensure that AI is used for good and in a sustainable way, we data scientists should keep these four principles in mind when building machine learning algorithms. We can play a key role in ensuring that data is used appropriately and responsibly.

What are the challenges of being a woman in data? How do you “dare” to be a woman in data?

People often ask me this question. Working in a male-dominated environment can be very intimidating for some women, and they want to know how I deal with it. Being in technology, especially in the finance sector, I have always been one of the few women (this has been the case since my Masters in Statistics). It is not just about knowing your domain very well; I think the main issue is self-confidence. Being self-confident is key if you want to convince, influence and inspire people. People naturally tend to listen to those they respect or who inspire them.

Here are a few tips I often share with the women I mentor. First, work on aligning your message with your body language. Always get a seat at the table; don’t hide at the back of the room in the shadows where no one can see you. In meetings, don’t hesitate to speak up and join the conversation: people can’t read minds, but they understand words. Market yourself. What is the point of building the best analytical tool if nobody knows about it? Think about yourself a bit more and don’t be afraid to say no. Finally, get mentors (both male and female) to understand where your blind spots are.

How do you bring more diversity?

I have built a data science team based on talent diversity. The team members come from very different backgrounds and cultures and complement each other very well.

I also encourage individual training and the exchange of ideas through various means, both offline (workshops, team bonding events, sharing sessions, stand-ups) and online (chats, intranet, shared spaces). Thanks to their complementary skills, team members constantly learn from each other and seek each other’s advice.

Our Technology division is also very diverse in terms of skills as well as gender. We have several women leading different teams, and our head of technology is also a woman.

In data science, however, there is a clear lack of female talent. Most women I interview want to join the healthcare sector or smart city initiatives.

In a constant effort to bring more diversity into the team, I am working hard to get more women on board.

Diversity in genetics allows species to evolve and adapt to changing environments.  In the same way,  diversity in the workplace is vital for organisations to stay relevant in a constantly changing world.

#StartupOfTheWeek – BeeBryte

BeeBryte is a French startup focused on a problem we all care about: energy consumption in buildings. They have developed a solution based on artificial intelligence to improve just that. Interview with Elodie Hecq, Managing Director Asia.

What’s the story behind BeeBryte?

BeeBryte was founded at the end of 2015 in Singapore by Frédéric Crampé and Patrick Leguillette.

Frédéric had been in Singapore for 15 years, years he spent successfully founding and running cleantech investment banks. Seeing that wholesale electricity prices are hugely volatile and highly correlated with CO2 emissions (the cleaner the energy, the cheaper it is), he saw an incredible opportunity: what if we could separate the time when you buy electricity from the time you actually have to use it?

He contacted a friend from his engineering school back in France. Patrick was working in the hybrid energy industry (wind or solar combined with traditional sources) and had a strong background in software engineering; he was immediately enthusiastic about the project.

This is how it all started, with the original idea of using software to control a battery and store energy at the right moments, here in Singapore. We patented our first optimization methodology in 2016.

However, batteries are still quite expensive and not financially viable in most countries. We know their time will come and it will be big, but in 2017 we decided to shift our focus. We saw another opportunity to address a more pressing need: better managing the energy consumption of existing buildings. Most of it comes from heating or cooling, and in Singapore we obviously focus on air-conditioning systems.

In addition to that, this is a much more sustainable business, which we’re all very happy with.

So how exactly do you do that?

In short, we use artificial intelligence to help office buildings and factories consume electricity in a smarter, more efficient and cheaper way.

We have built a software-as-a-service with a patented real-time optimization methodology, self-learning models and predictive analytics for dynamic energy management. This platform runs in the cloud.

Then we install a simple IoT box at the customer’s premises, and we can control flexible electric equipment (such as heating-cooling systems, electric vehicle charging stations, battery storage systems, etc.). We focus on equipment whose operation can be shifted in time without any perceptible impact on comfort or service rendered. The beauty of it is that it can be deployed on almost any cooling system already in place.

The cloud platform anticipates a building’s energy needs by analyzing historical correlations with business activity, weather, building occupancy, etc. An optimal control strategy is calculated as often as necessary, and the corresponding instructions are sent to the equipment via the box.

What customers love is how we adjust the A/C based on weather forecasts. This makes us really different from other players, since we don’t just do static, reactive control. In Singapore, you can typically know a bit in advance when it is going to rain. Just before the rain it gets hotter, and when the rain comes the temperature falls naturally. So you can adjust the A/C before it rains, let the rain cool the building down, and then turn the A/C up again afterwards. The idea is simple (the tough part is putting it in place) and our customers love it!
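As an illustrative sketch only (not BeeBryte’s patented methodology), the pre-cooling idea can be expressed as a simple rule: lower the setpoint just before forecast rain, then ease off while the rain cools the building naturally. All numbers are invented.

```python
def setpoint_schedule(forecast, base_setpoint=24.0):
    """forecast: list of (hour, rain_expected) tuples for the coming hours."""
    schedule = []
    for i, (hour, rain) in enumerate(forecast):
        rain_next_hour = i + 1 < len(forecast) and forecast[i + 1][1]
        if rain:
            setpoint = base_setpoint + 1.5   # rain is cooling the building; ease off
        elif rain_next_hour:
            setpoint = base_setpoint - 1.0   # pre-cool ahead of the rain
        else:
            setpoint = base_setpoint
        schedule.append((hour, setpoint))
    return schedule

# Rain forecast at 2 pm: pre-cool at 1 pm, relax during the rain.
print(setpoint_schedule([(13, False), (14, True), (15, True), (16, False)]))
```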

Where does BeeBryte stand as a business today?

We opened offices in France in 2016 and focused on our product. Most of our technical team was based there and focused on artificial intelligence.

Then we started commercializing our product in 2017, in France and Singapore. In Singapore our biggest customers are DHL and Ngee Ann Polytechnic, and we work with many office buildings, factories, warehouses, museums and universities. We have a foothold in 43 buildings as of today!

And in 2018 we raised SGD 4M. This enabled us to recruit a lot of people, many of them in sales. The team doubled in a few months and was rebalanced between both countries: we now have 15 people in France and 10 in Singapore. Our solution was mostly ready, so the key objective is to strengthen our position in the French and Singaporean markets, and grow in the UK, Germany, Italy and Malaysia. Then in 2020 we’d like to expand into Australia: it is a very promising market for us, but setting up an office there will obviously be costly, so we don’t want to rush.

What is in the works for the future?

We were looking for a real partner, not just any VC, and we found it with CNR (Compagnie Nationale du Rhône). By the way, our headquarters are now in France, to be closer to them.

They are the biggest producer of electricity from renewable sources in France. Interestingly they also have a small electricity provider business with which we are working very closely.

In Singapore our positioning will remain as it is today, focusing on the value we bring. The residential market opened recently (a market BeeBryte is not interested in), and the number of suppliers jumped from 10 to 25 in the last year. All of them compete on price today, but we believe the winners will be the ones who bring real, differentiated value. And that’s where we come in…

Most consumers today don’t have access to the wholesale market, so they buy electricity at a fixed price. The provider takes a margin to cover potential peaks.

Hence, working with a provider, we could control loads and consumption. Because we balance the peaks, prices and risk are lower for the provider, and as such prices are lower for our customers. It’s a holistic way to manage your electricity. The objective is to launch this offer in Singapore by the end of 2019.

What are the drivers for customers to buy your solution?

There are really two aspects. Some companies are very focused on cost reduction, but the big MNCs are more and more interested in reducing their carbon footprint by 2030. In those cases, we have far more discussions with the sustainability teams than with operational ones.


The API mindset: helping finance enter the programmable economy

More and more companies across industries understand that APIs (Application Programming Interfaces) are the way to re-align IT and business. They can increase agility while reducing time from ideation to innovation, and they can consistently deliver value in a digital world. This is even clearer in fintech, where regulators now require financial institutions to provide open APIs to build an ecosystem and facilitate innovative services.

For the last 10 years, startups like Stripe have demonstrated that APIs are the next distribution channel for the programmable economy. Get this right and you too could be worth $22.5Bn in less than 10 years. If your interface is well designed, most of your customers will integrate your services autonomously, in a totally self-service manner.

Regulations now force organizations to open APIs, and investors support the new generation of banks and services built on top of them. In 2017, this new ecosystem already represented more than $100Bn in funds raised and almost $900Bn in valuation. Think of neo-banks like Revolut, N26 and Starling Bank, which bring value back to the end customer.

The corporates’ legacy becomes their burden

Corporates dream of innovation and disruption and talk about it all the time. The truth is, young and agile startups are really in the best position to do it. Marc Andreessen, one of the most iconic venture capital investors in Silicon Valley, said it in 2011: “Software is eating the world”.

Applied to the fintech world, this means that the best companies of tomorrow won’t be banks who do software, but software companies who do banking. As Chris Anderson argued in 2008 in “The End of Theory”, companies who understand software and data don’t need to be experts in their business to be good at it. They will learn so much more, so much faster, with so much more data and interaction that, in the end, they will perform better than companies who have been specialists for a long time but rely on their legacy.

If software is eating the world, APIs are eating software

In a distributed, connected world, some companies understand that they don’t just need to adapt to APIs, they need to adopt them at the core. As Steven Willmott says: “If software is eating the world, APIs are eating software”.

Inspired by tech giants whose growth engine is their API strategy, some companies embrace new internal governance models, integration partnership strategies, and ultimately monetization models. Examples include BBVA, Capital One, Allianz, but also Airbus, Lufthansa, or Walmart.

Fortunately, regulation is forcing laggards to move and open up APIs. It began with the banking industry in Europe, when the Open Banking UK and PSD2 regulations required integration capabilities between banks and third-party providers, and has been followed by Australia, India, Singapore, Hong Kong, Mexico and South Africa. Canada and the USA are next in line.

Other sectors are also impacted, like the healthcare industry, where the HL7 FHIR requirements help providers manage medical records. In logistics and supply chain, shared API standards are emerging to compete with Amazon.

Bringing the API mindset to corporates

The API is a mindset, more so than a technology. Corporates must put APIs at the core of their strategy to enable new business models. Technically, this requires new skills and roles, such as API product manager and APIOps. But the first and most important step is culture, with the intent of developing an API mindset.

The API mindset consists of organizational changes, strategy, and practice to align IT and business around APIs, while respecting a technical and business contract and enabling the ecosystem to play its part.

The API mindset means that every service should be externalizable, as Jeff Bezos put it. As he mandated to all Amazon employees in 2002, services should be designed from the ground up to be externalizable via APIs that developers will love.

The API mindset is about thinking API-first, building the API before the website or applications so that an organization can deliver the same service to all channels via a unique interface. This is how you think customer experience first, rather than product delivery.

The API mindset is about thinking beyond mere API facades to define an interface contract that, if respected, will enable a company to refactor technical debt and decouple the monolith into smaller services without losing customers. Because at the end of the day, respecting the interface contract is the technical and business promise of the API economy. As Werner Vogels, CTO of Amazon Web Services, says: “code can change but APIs are forever”.
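A toy sketch of what an interface contract means in practice: the public endpoint (URL, fields, types) stays fixed, so the implementation behind it can be refactored without breaking consumers. The endpoint and fields below are hypothetical, shown in Python with Flask for illustration.

```python
from flask import Flask, jsonify

app = Flask(__name__)

def fetch_account_from_monolith(account_id: str) -> dict:
    # Today: legacy monolith logic sits behind the contract.
    return {"id": account_id, "balance": 1250.0, "currency": "SGD"}

@app.route("/v1/accounts/<account_id>")
def get_account(account_id: str):
    # Tomorrow: swap the call below for a new microservice. Consumers never
    # notice, because the contract (path, fields, types) is unchanged.
    return jsonify(fetch_account_from_monolith(account_id))
```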

The API mindset is about thinking of the API as a product: designing and building APIs to be delivered and implemented internally to increase business breadth, and monetized in a business context.

The API mindset is about building a strong developer experience and creating a powerful API design. These are the mandatory skills to enable application builders to build great user experiences on top of your platform.

The API mindset is about continuous management of your APIs: always making better decisions by knowing exactly which service is used, by whom, where, and how many times, in order to monitor and drive your architecture and your business.

The API mindset is not about a technology, an architectural style or an IT project. It is about leveraging internal energies to make companies think bigger, deliver faster, and be more resilient. This is the only way for corporate giants from all industries to resist disruption and adopt digital transformation, internally and externally. Because agility is what drives businesses in the digital world of today.

About the author

This article was written by Mehdi Medjaoui, co-founder of the APIdays conference and co-organizer of the upcoming APIdays Singapore on April 23 and 24 at the Arts House (the old parliament of Singapore). Mehdi is also the founder of OAuth.io (acquired), an API expert for the EU Commission, a professor at the HEC MBA, and Chief API Economist.

Over two days of conference with international and local speakers from the API community, APIdays Singapore will explore this API mindset, the technical and business aspects of API thinking, and how it applies to the fintech-as-a-service ecosystem.

Securing the API Economy

If you are a business owner (CEO, CIO, or head of a business line) you continuously need to look for ways to innovate, out-think and out-maneuver your competition. In the sharing and collaborative economy we live in, you have an unprecedented opportunity to team up with others to make what you’re good at even better, even more compelling, or part of a larger value proposition. Never before in our history has it been simpler to do so.

The argument is this: there are capabilities that YOU offer and deliver better than anyone else. As a result, you want to make full use of previously isolated data sources and make your capabilities available to others. To do so, you need to plug yourself more into the fabric of the “API economy.”

The term API (Application Programming Interface – a program calling another program through its interface) has been around for a long time; however, over the past few years there’s been increased interest in APIs, more specifically “Business APIs” or “Web APIs.” “Business APIs” are pretty simple to understand: they are interfaces focused on business assets – for example, a product, a client, an order.
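As a hypothetical example of a Business API in action, consuming an “order” asset over the web can be a few lines of code; the URL, token and fields are invented for illustration.

```python
import requests

# Fetch a single business asset (an order) from a hypothetical Business API.
resp = requests.get(
    "https://api.example.com/v1/orders/8841",
    headers={"Authorization": "Bearer <token>"},
)
order = resp.json()
print(order["status"], order["total"])
```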

The “API Economy” relates to the use of “Business APIs” to positively impact a business or a government agency.  API initiatives focus mainly on business drivers related to:

  • innovation
  • faster time-to-market
  • improved sharing of assets
  • the creation of new revenue streams by connecting the physical world with the online world (omnichannel strategy), and by addressing new clients/industries/geographies/use cases
  • improved client experiences

Increased employee engagement is also a focus area for many organizations.

Business APIs are making it easier to integrate and connect people, places, systems, services, data, products, things and algorithms. All industries, all verticals, and corporations of all sizes can extract value out of their data and connect with others – not only tech corporations. APIs now allow any business to embed itself into other folks’ business in an unprecedented way.

Random examples of the API Economy at play include:

  • Google providing Google Maps, which a retailer or ride-sharing company can plug into its applications without having to build its own mapping system
  • PayPal or Stripe payment systems integrated with B2C apps
  • Using your Facebook account to sign in to various apps
  • Brick-and-mortar retailers developing ecommerce channels and leveraging backend APIs to handle things like payments, shipping, etc.
  • IoT technology-enabled pest control systems with automatic notifications of caught rodents in Wi-Fi connected snap traps.

The API Economy’s value is in the trillions of US$, per various industry estimates: https://www.mckinsey.com/business-functions/digital-mckinsey/our-insights/what-it-really-takes-to-capture-the-value-of-apis

In fact, an interesting industry survey suggests that more than a third (35%) of enterprises generate 25% or more of their topline sales from APIs:  https://www.mulesoft.com/press-center/technology-trends-2018-connectivity-benchmark. An astonishing number!

But here is the downside: as we transition into an increasingly digital-first environment powered by the API Economy, fraud actors follow the data, simply because data (after human capital) is one of the most valuable assets a business has. And APIs are the key to that data.

If your API is insecure, or if your workloads or your users’ online browsing or identities get compromised, you open up a threat vector into your business AND your ecosystem of partners.

Bottom line: When business leaders and developers connect disparate data together and core transactional systems are made available publicly, this increases the attack surface for malicious actors who can now infiltrate entire ecosystems through their supply chains.

How to mitigate your risks in the API Economy?

As a progressive business leader who is winning in the market by leveraging partners’ ecosystems, the last thing you want is for fraud actors to steal your confidential or regulated data or your financial assets.

As always in IT security, you must adopt a three-pronged strategy to minimize risks and boost your cyber security posture:

  1. People – continuous user education and awareness so that your employees truly become a “human firewall” and can spot a phishing email a mile away
  2. Processes – there are good practices aplenty around regular data backup, patching and incident response (beyond the scope of this blog post)
  3. Technology – more on this below

The best technologies will not secure your business from malicious actors if you deploy and configure them wrongly. The recent SingHealth breach in Singapore proves that no matter how advanced your security tools are, if your people or processes “break,” you’re in for trouble and for a lot of unwanted attention, in ways that will impact your reputation or revenue or both. https://www.zdnet.com/google-amp/article/employees-sacked-ceo-fined-in-singhealth-security-breach/

If you intend to be an active part of the API Economy and provide your APIs to others, you will be a target for security breaches if you don’t properly think through versioning and deployment. Start by securing your APIs with an application services governance framework, which caters to end-to-end governance for all types of network services. A good starting point for your research is the 2018 Magic Quadrant for Full Lifecycle API Management by global research and advisory firm Gartner.
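As a minimal illustration of one control such a governance layer typically enforces (far from a full lifecycle framework), here is a hedged sketch of access-token checking in front of an API endpoint; the framework choice, token store and endpoint are assumptions made for the example.

```python
from functools import wraps
from flask import Flask, abort, jsonify, request

app = Flask(__name__)
VALID_TOKENS = {"example-token"}  # placeholder; real systems verify signed JWTs

def require_token(view):
    @wraps(view)
    def wrapper(*args, **kwargs):
        auth = request.headers.get("Authorization", "")
        token = auth[7:] if auth.startswith("Bearer ") else ""
        if token not in VALID_TOKENS:
            abort(401)  # reject the call before it reaches the backend
        return view(*args, **kwargs)
    return wrapper

@app.route("/v1/orders")
@require_token
def list_orders():
    # Hypothetical business asset exposed by the API
    return jsonify([{"id": 8841, "status": "shipped"}])
```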

Additionally and importantly, if not done already, you must secure your workloads (whether your applications or services reside on-premises in your traditional IT infrastructure, off-premises in a public or private cloud, or in a hybrid IT model) and your employees’ identities. Technologies from WatchGuard around network security and multi-factor authentication will certainly help you achieve this important aspect of securing the API Economy.

Sylvain Lejeune, RVP APJ WatchGuard Technologies

linkedin.com/in/sylvainlejeune

How to defeat Malicious Everything-as-a-Service

In the sharing & collaborative economy we live in, we are witnessing two major trends at play.

First, an increasing number of people are getting online. Recent statistics suggest that 4 billion people around the world are now using the internet (this is half of the world’s population): https://wearesocial.com/blog/2018/01/global-digital-report-2018

Second, the consumerization of IT. Business leaders and lines of business are increasingly consuming IT services from their own IT department or directly from public cloud service providers (a trend called “shadow IT”), on a pay-as-you-go (PAYG) basis. This IT-as-a-service framework has a few fundamental attributes:

  • Standardization
  • Automation
  • The availability of a catalog of services (the “service menu”)
  • Orchestration
  • A business and charging model based on consumption/PAYG
  • Self-service capability

We are now living in a demand-driven model, versus the old supply-driven model, which was focused on the available legacy technology and its constraints.

The winners in today’s super-competitive markets are those that can out-think and out-maneuver their competition. They do so by leveraging a self-service operating model with a high degree of standardization and automation, increasingly under a consumption-based (PAYG) business model.

As a result, tech is increasingly present in every single revenue stream.

And bad actors have followed suit. They are leveraging the aforementioned trends to pocket large financial benefits. They are making malicious code and attacks available to the masses as “kits” which can be consumed as-a-service off of service menus built on highly automated and scalable architectures. Add all the stolen data to the mix and you have a very powerful (and daunting) value proposition.

It is very easy, cost-effective and fast now for malicious actors to modify hashes and create new malware variations that evade signatures. Hence the massive amounts of malicious code out there. More on this later.

Examples of “Malicious Everything as-a-Service” abound

Phishing attacks. There are now phishing kits available for sale. They comprise phishing website resources and tools that need only be installed on a server. Once installed, all the fraud actor needs to do is send out emails to potential victims. Email addresses of potential victims are available on the deep web – just like phishing kits.

Ransomware-as-a-Service, or RaaS, refers to ransomware distribution kits sold on the dark web for a few hundred dollars that allow malicious users with little technical skill to launch attacks relatively easily. Some of these kits allow fraud actors to create their very own customized version of a given ransomware, e.g., Satan, with a “profit-sharing” business model (e.g., the RaaS developer takes a 30% cut of any payments made by victims, and the attacker pockets 70%).

DDoS attack tools are also easily available. A simple web search reveals a significant number of booter and stresser services openly advertised which give unskilled individuals the ability to launch significant DDoS attacks. 2016 marked a turning point with the Mirai malware, which triggered DDoS attacks originating from botnets of compromised Internet of Things (IoT) devices. A series of devastating attacks from the Mirai botnet struck a number of high-profile targets. Variations of the Mirai malware are still active today. More details at https://en.wikipedia.org/wiki/Mirai_(malware)

One of the most active services for launching distributed denial-of-service (DDoS) attacks, WebStresser.org, was taken down in April 2018. The service had more than 136,000 registered users, and it is estimated to have contributed to millions of attacks over a three-year period. All of this for a mere 15 euros a month, letting users carry out devastating attacks.

In all three aforementioned examples (phishing kits, RaaS and DDoS attack tools), the business model, automation, standardization, service menu and self-service capability are five attributes that closely align with IT-as-a-Service and the collaborative economy we mentioned earlier.

An avalanche of malware, compromised URLs and DDoS attacks

The phenomenon of “Malicious Everything as-a-Service” and the rapid growth in the volume of available highly standardized kits have led to a deluge of malware, cryptomining software, compromised URLs, DDoS attacks (in the wake of Mirai), etc.

As briefly mentioned earlier, it is easy and fast to create new malware or mutate** existing malware to evade detection. Today’s malware threats are far more advanced and prolific than ever before. Modern malware creation is automated, so it takes very little effort for attackers to mutate a piece of malware. [**Mutating malware is the process of changing existing malicious software without altering its functionality, often by changing the malware’s hash. Mutation allows malware to evade signature-based anti-malware solutions such as traditional antivirus.]
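A harmless illustration of why hash-based signatures are so easy to evade: changing a single byte of a file yields a completely different hash, so a blocklist keyed on known-bad hashes no longer matches. The “malware” below is just stand-in bytes.

```python
import hashlib

original = b"stand-in malicious payload"
mutated = original + b"\x00"  # trivial one-byte mutation, same behavior class

# Signature list keyed on hashes of known-bad files.
known_bad_hashes = {hashlib.sha256(original).hexdigest()}

for name, blob in [("original", original), ("mutated", mutated)]:
    digest = hashlib.sha256(blob).hexdigest()
    print(f"{name}: {digest[:16]}... flagged={digest in known_bad_hashes}")
```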

The case for man and machine working together

The rapidly increasing volume of advanced, evasive cyber threats creates an urgent need to augment traditional human involvement in addressing IT threats (through the provision of signatures, whitelisting, blacklisting, heuristics, etc.) with the immense capabilities of artificial intelligence. In particular, machine learning and deep learning models can deal with vast data sets, an ability that humans simply do not possess.

Machines and algorithms bring automation, quicker response times, reduced error rates and pre-execution capabilities to the table. It is all about processing and analyzing large amounts of relevant data, and scale.

Human analysts bring insights at two critical levels. First, once the AI models have sorted through the data, human analysis takes over and looks into suspicious patterns of activity to confirm whether these are actual attacks or false positives.

Second, that human analysis feeds back into the machine learning models (e.g., by adding another layer of security, or by continuously sorting and adjusting a mix of supervised and unsupervised machine learning models) to improve pre-execution outcomes and future predictions.
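Sketched in code, the loop could look like the following; the classifier, features and labels are all invented for illustration, not any vendor’s actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                 # feature vectors for past events
y = (X[:, 0] + X[:, 3] > 1.5).astype(int)     # stand-in ground-truth labels

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# 1. Machine: score new events and surface the most suspicious for review.
new_events = rng.normal(size=(20, 8))
scores = model.predict_proba(new_events)[:, 1]
to_review = np.argsort(scores)[-5:]           # top-5 most suspicious events

# 2. Human: the analyst confirms or rejects each alert (simulated here).
analyst_labels = (new_events[to_review, 0] + new_events[to_review, 3] > 1.5).astype(int)

# 3. Feedback: fold the confirmed labels back into training and retrain.
X = np.vstack([X, new_events[to_review]])
y = np.concatenate([y, analyst_labels])
model.fit(X, y)
print("retrained on", len(y), "labeled events")
```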

This is the power of man and machine working together to address the increasingly automated, standardized production of “Malicious Everything” delivered as-a-service to wannabe hackers who are flooding businesses, government agencies and consumers with compromised websites, DDoS attacks, cryptomining software and malware of all sorts.

Sylvain Lejeune, WatchGuard RVP Asia Pacific & Japan