F-Secure’s Cyber Threat Landscape for the Finance Sector report shows that the sophistication of adversaries targeting banks, insurance companies, asset managers and similar organizations can range from common script kiddies to organized criminals and state-sponsored actors. And these attackers have an equally diverse set of motivations for their actions, with many seeing the finance sector as a tempting target due to its importance in national economies.

The report breaks down these motivations into three groups: data theft, data integrity and sabotage, and direct financial theft.

“This is a useful way to think about cyber threats, because it is easy to map attacker motivations across to specific businesses, and subsequently understand to what extent they apply,” says F-Secure Senior Research Analyst George Michael. “Once you understand why various threat actors might target you, then you can more accurately measure your cyber risk, and implement appropriate mitigations.”

Data integrity and sabotage – where systems are tampered with, disrupted or destroyed – is the cyber criminals’ method of choice. Ransomware and distributed denial-of-service attacks (DDoS) are among the more popular techniques used by cyber criminals to perform these attacks.

Similar attacks have been launched by state-sponsored actors in the past. But these are less common and often linked to geopolitical provocations such as public condemnation of foreign regimes, sanctions, or outright warfare.

And while North Korea has the unique distinction of being the only nation-state believed to be responsible for acts of direct financial theft, their tactics, techniques, and procedures (TTPs) have spread to other threat actors.

According to Michael, this is part of a larger trend that involves adversaries offering their customizable malware strains or services-for-hire on the dark web, contributing to a rise in the adoption of more modern TTPs by attackers.

“North Korea has been publicly implicated in financially-motivated attacks in over 30 countries within the last three years, so this isn’t really new information,” says Michael. “But their tactics are also being used by cyber criminals, particularly against banks. This is symbolic of a wider trend that we’ve seen in which there is an increasing overlap in the techniques used by state-sponsored groups and cyber criminals.”

In addition, understanding cyber threats relevant to specific organizations is crucial to being able to detect and respond to an attack when it occurs.

“Understanding the threat landscape is expensive and time-consuming,” says Michael. “If you don’t understand the threats to your business, you don’t stand a chance at defending yourself properly. Blindly throwing money at the problem doesn’t solve it either – we continue to see companies suffer from unsophisticated breaches despite having spent millions on security.”

As a result, says Jason Lin, CFO at Centage Corporation, CFOs are losing sleep over the end result. “This is so far from ideal, which is why I’m offering these five recommendations to help financial teams sleep better.”

1. Instill confidence in your data

I totally get why finance teams lack confidence in their budget data. Last year’s actuals are typically re-keyed into a budget spreadsheet, and manual data entry inevitably leads to mistakes. Worse, it’s incredibly difficult to spot where, in a series of spreadsheets linked together with macros, a zero may have been left out or numbers were transposed. And once the data is entered, it’s used for what-if scenario planning -- i.e. predicting the future -- which takes the budget even further away from the “truth.”

Finance teams can get a lot more sleep if they ditch the spreadsheet and replace it with a tool that pulls data directly from their GLs. Not only will the data be accurate (and teams spared countless hours of data entry), the budget will mirror how the business is organized, making scenario planning a lot more accurate. Of course, the predictions may still be wrong, but at least the effects of those assumptions on the financial statements will be realistic.

2. Avoid forecasts that have major variances versus actuals

This is a tough one because there are so many external variables that can affect the actuals. What will the economy do? Will interest rates go up? Will new tariffs drive up manufacturing costs? How is that upcoming election going to shake out? In all honesty, attempting to predict market conditions in Q4 2020 in the summer of 2019 is a bit unrealistic. No amount of effort will change that reality.

My best recommendation: move to a rolling forecast that’s updated monthly, or at least once a quarter. Not only will it lessen the variances, but it will also allow teams to spot trends that have the potential to affect the goals set (positively or negatively) much earlier.
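To make the mechanics concrete, below is a minimal Python sketch of a monthly rolling forecast, assuming actuals arrive as a pandas Series indexed by period; the function name and the naive trailing-average extension are illustrative assumptions, not features of any particular budgeting tool.

```python
# A minimal rolling-forecast sketch (illustrative, not a product API):
# each month, overwrite forecasted values with actuals as they land,
# then extend the horizon so it always covers the next 12 months.
import pandas as pd

def roll_forward(forecast: pd.Series, actuals: pd.Series) -> pd.Series:
    updated = forecast.copy()
    updated.update(actuals)                  # replace forecast with actuals
    trailing_avg = updated.iloc[-3:].mean()  # naive assumption for new month
    next_month = updated.index.max() + 1     # next pd.Period
    updated.loc[next_month] = trailing_avg
    return updated.iloc[-12:]                # keep a constant 12-month window

months = pd.period_range("2019-07", periods=12, freq="M")
forecast = pd.Series(100.0, index=months)
actuals = pd.Series([103.0], index=months[:1])   # July actual lands
print(roll_forward(forecast, actuals))
```

Each monthly run shrinks the gap between forecast and actuals, which is exactly why the variances stay smaller than in an 18-month-old static budget.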

3. Test your assumptions for accuracy

I realize what a big ask this recommendation is. The issue of testing your assumptions for accuracy will never go away because, as mentioned above, too many factors affect performance that are entirely outside of your control.

While you can’t control what will happen, you can anticipate potential variances and put plans in place to respond to them. Scenario planning and what-if scenarios are your saving grace here. For instance, you can test the impact on your P&L if sales decrease by, say, 10%, or if the cost of oil spikes. You might not like what you see, but at least you’ll know the potential outcomes ahead of time, so you can warn the executive team upfront and make contingency plans in case your assumptions aren’t correct.
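As a rough illustration of this kind of what-if testing, here is a small Python sketch that shocks a toy P&L; the line items, base figures and shock sizes are invented for the example.

```python
# Scenario sketch: compare profit under a base case and two shocks.
# All figures are illustrative assumptions.
BASE_PL = {"revenue": 10_000_000, "cogs": 6_000_000, "opex": 2_500_000}

def profit(pl):
    return pl["revenue"] - pl["cogs"] - pl["opex"]

def apply_shock(pl, revenue_pct=0.0, cogs_pct=0.0):
    """Return a shocked copy of the P&L, e.g. sales -10% or COGS +15%."""
    return {
        "revenue": pl["revenue"] * (1 + revenue_pct),
        "cogs": pl["cogs"] * (1 + cogs_pct),
        "opex": pl["opex"],
    }

scenarios = {
    "base case": BASE_PL,
    "sales -10%": apply_shock(BASE_PL, revenue_pct=-0.10),
    "oil spike, COGS +15%": apply_shock(BASE_PL, cogs_pct=0.15),
}
for name, pl in scenarios.items():
    print(f"{name:22s} profit = {profit(pl):>12,.0f}")
```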

4. Meet your budget deadlines and be boardroom ready

When I hear the concerns of CFOs about meeting deadlines I like to tell people what Steve Player, noted business author and Program Director for the Beyond Budgeting Round Table (BBRT) North America, has to say about it. To paraphrase his viewpoint: starting earlier is a terrific way to build more errors and delays into your budget. Again, in the summer of 2019 you are attempting to predict what Q4 2020 will look like. Do you know the outcome of the 2020 election? Do you know whether we’ll continue to see massive flooding in the South and Midwest? How will either of these events affect your actuals?

The solution is to shift your focus to a continuous process. If you believe in planning, why not do it monthly? It makes no sense whatsoever to start earlier and earlier when it’s not humanly possible to predict what the world will look like 18 months from now.

5. Break down your company silos

It shouldn’t come as any surprise that when budgets are created in silos, they won’t mesh with one another. Marketing will spend the summer months coming up with campaigns to launch the following year, while sales will review their customer and prospect pipeline and make their own plans. There is no connection between the two.

Financial teams have two options to address the issue of silos. First, implement a collaborative budgeting tool so that teams can see how their plans affect one another. If sales is pinning a revenue number on an increase in new SMB logos, marketing needs to know that and allocate part of its budget to an SMB customer acquisition campaign. Second, view this as an excellent opportunity to take a more hands-on leadership role in the business. Bring the two teams together, and help them create a tighter plan.

I realize that some of these suggestions can seem blasphemous; finance teams have always created budgets, stayed in the back office, and put stakes in the ground in terms of assumptions. But given the pace of business change, the old ways aren’t cutting it anymore. These tips reflect the reality of business planning today.

This week Finance Monthly hears from Caroline Hermon, Head of Adoption of Artificial Intelligence and Machine Learning at SAS UK & Ireland, on the adoption of open source analytics in the finance sector and beyond.

Open source software used to be treated almost as a joke in the financial services sector. If you wanted to build a new system, you bought tried and tested, enterprise-grade software from a large, reputable vendor. You didn’t gamble with your customers’ trust by adopting tools written by small groups of independent programmers. Especially with no formal support contracts and no guarantees that they would continue to be maintained in the future.

Fast-forward to today, and the received wisdom seems to have turned on its head. Why invest in expensive proprietary software when you can use an open source equivalent for free? Why wait months for the official release of a new feature when you can edit the source code and add it yourself? And why lock yourself into a vendor relationship when you can create your own version of the tool and control your own destiny?

Enthusiasm for open source software is especially prevalent in business domains where innovation is the top priority. Data science is probably the most notable example. In recent years, open source languages such as R and Python have built an increasingly dominant position in the spheres of artificial intelligence and machine learning.

As a result, open source is now firmly on the agenda for decision makers at the world’s leading financial institutions. The thinking is that to drive digital transformation, their businesses need real-time insight. To gain that insight, they need AI. And to deliver AI, they need to be able to harness open source tools.

The open source trend encompasses more than just the IT department. It’s spreading to the front office too. Notably, Barclays recently revealed that it is pushing all its equities traders to learn Python. At SAS, we’ve seen numerous examples of similar initiatives across banking domains from risk management to customer intelligence. For example, we’re seeing many of our clients building their models in R rather than using traditional proprietary languages.

A fool’s paradise?

However, despite its current popularity, the open source software model is not a panacea. Banks should still have legitimate concerns about support, governance and traceability.

The code of an open source project may be available for anyone to review. But tracing the complex web of dependencies between packages can quickly become extremely difficult. This poses significant risks for any financial institution that wants to build on open source software.

Essentially, if you build a credit risk model or a customer analytics application that depends on an open source package, your systems also depend on all the dependencies of that package. Each of those dependencies may be maintained by a different individual or group of developers. If they make changes to their package, and those changes introduce a bug, or break compatibility with a package further up the dependency tree, or include malicious code, there could be an impact on the functionality or integrity of your model or application.
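To get a feel for how quickly that web grows, here is a hedged Python sketch that walks the declared requirements of an installed package using only the standard library; it skips conditional (environment-marker) dependencies and extras, so the count is approximate.

```python
# Approximate transitive-dependency walk using importlib.metadata
# (standard library). Marker-conditional requirements are skipped,
# so this deliberately undercounts; the point is the shape of the tree.
import re
from importlib.metadata import requires, PackageNotFoundError

def transitive_deps(package, seen=None):
    seen = set() if seen is None else seen
    try:
        reqs = requires(package) or []
    except PackageNotFoundError:
        return seen                      # not installed; stop here
    for req in reqs:
        if ";" in req:                   # skip environment-marker deps
            continue
        name = re.split(r"[\s\[<>=!~]", req, maxsplit=1)[0].lower()
        if name and name not in seen:
            seen.add(name)
            transitive_deps(name, seen)
    return seen

deps = transitive_deps("pandas")         # any installed package works
print(f"pandas pulls in ~{len(deps)} packages, each maintained separately")
```

Every name printed is a separate point of trust for the institution building on top.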

As a result, when a bank opts for an open source approach, it either needs to put trust in a lot of people or spend a lot of time reviewing, testing and auditing changes in each package before it puts any new code into production. This can be a very significant trade-off compared to the safety of a well-tested enterprise solution from a trusted vendor. Especially because banking is a highly regulated industry, and the penalties for running insecure or noncompliant systems in production are significant.

What use is power without control?

When it comes to enterprise-scale deployment, open source analytics software also often poses governance problems of a different kind for banks.

Open source projects are typically tightly focused on solving a specific set of problems. Each project is a powerful tool designed for a specific purpose: manipulating and refining large data sets, visualising data, designing machine learning models, running distributed calculations on a cluster of servers, and so on.

This “do one thing well” philosophy aids rapid development and innovation. But it also puts the responsibility on the end user – in this case, the bank – to integrate different tools into a controlled, secure and transparent workflow.

As a result, unless banks are prepared to invest in building a robust end-to-end data science platform from the ground up, they can easily end up with a tangled string of cobbled-together tools, with manual processes filling the gaps.

This quickly becomes a nightmare when banks try to move models into production because it is almost impossible to provide the levels of traceability and auditability that regulators expect.

Language doesn’t matter

The good news is that there’s a way for banks to benefit from the key advantages of open source analytics software – its flexibility and rapid innovation – without exposing themselves to unnecessary governance-related risks.

The language a bank’s data scientists choose to write their code in shouldn’t matter. By making a clean logical separation between model design and production deployment, banks can exploit all the benefits of the latest AI tools and frameworks. At the same time, they can keep their business-critical systems under tight control.
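One way to picture that separation, sketched here rather than describing any particular vendor’s platform: the design side hands over a serialised model artifact, and production depends only on a narrow scoring contract. The scikit-learn model and the file name below are illustrative choices.

```python
# Illustrative split between model design and production deployment.
import pickle
from sklearn.linear_model import LogisticRegression

# --- design side: data scientists, free choice of open source tools ---
X = [[0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [0.0, 0.0]]
y = [1, 0, 1, 0]
model = LogisticRegression().fit(X, y)
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)                # the hand-over artifact

# --- production side: governed and audited, knows only the contract ---
def score(features):
    """Features in, probability out: the only interface production sees."""
    with open("model.pkl", "rb") as f:
        deployed = pickle.load(f)
    return float(deployed.predict_proba([features])[0][1])

print(score([1.0, 1.0]))
```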

SAS plus open source

One SAS client, a large financial services provider in the UK, recently took this exact approach. The client uses open source languages to develop machine learning models for more accurate pricing. Then it uses the SAS Platform to train and deploy models into full-scale production. As a result, model training times dropped from over an hour to just two and a half minutes. And the company now has a complete audit trail for model deployment and governance. Crucially, the ability to innovate by moving from traditional regression models to a more accurate machine learning-based approach is estimated to deliver up to £16 million in financial benefits over the next three years.

Digital transformation: from buzzword to reality

Business’s biggest buzzword, digital transformation, has taken the technology world by storm. Despite being around since the 1990s, it has recently become a term that permeates almost every business strategy or vision for the future. Today, every technology provider claims it can enable digital transformation, and all well-informed CEOs are mandating it to drive their business forward.

The key to unlocking this business growth starts with technology’s second-biggest buzzword: data.

Barriers to a data-driven digital transformation

While data may be ‘the new oil’, an uncontrollable explosion of unrefined data doesn’t add any value to a business. You must be able to sort, process and examine data, view it from different angles and understand how to extract intelligent insights from it. To do this, having the right infrastructure in place is key. Indeed, large businesses in particular often struggle with legacy systems that aren’t designed to handle the volume of data we now produce and consume.

For finance departments, reliance on outdated technology is certainly part of the problem, but there are also other issues that need to be addressed. A recent BlackLine survey examining attitudes to financial data revealed that the C-suite’s top perceived challenge was that data was from too many sources and there was uncertainty over whether it was all being accounted for. Over a quarter of C-suite executives and finance professionals (28%) claimed that there were not enough automated controls and checks for the volume of data they had to deal with and that the process of collecting and processing the data was too complex (also 28%).

So, what steps can finance professionals take to address these challenges? And what questions should finance departments ask as part of their quest to become truly data-driven?

Assess your data foundation

The vital first step for any transformation journey is to assess how far along the road you’ve already travelled. It might seem obvious that real-time access to accurate, reliable data – including financial data – can be used for strategic analysis and to create a competitive edge. But what may be less obvious is how damaging it can be to base this analysis on poor quality, unstructured or untrustworthy data.

According to BlackLine’s survey results, almost seven in 10 respondents believed that either they themselves or their CEO had made a significant business decision based on out-of-date or incorrect financial data. Not only can tapping into poor data compromise the decisions you make, but it can also seriously hamper your organisation’s ability to transform longer-term.

Is your data accurate?

With this in mind, the first question any organisation or department should consider is whether the data they do have is accurate. Is there real confidence in the precision of your data; can you be confident in the decisions you make from it? If there are inaccuracies, where are they coming from, and what processes or controls can you put in place to improve this?

In the finance department, clunky spreadsheets and outdated processes often leave finance teams in the dark until month-end, resulting in rushed work, manual workarounds and an increased risk of human error. By automating manual, predictable and repeatable processes, such as transaction matching or journal entries, data not only becomes more reliable but time is also freed up for more valuable work.
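As an illustration of what automated transaction matching might look like, here is a minimal pandas sketch that reconciles GL entries against bank statement lines on amount within a date tolerance; the column names and the three-day window are assumptions for the example.

```python
# Minimal transaction-matching sketch: exact amount match plus a
# date tolerance. Real reconciliation engines add far more rules.
import pandas as pd

gl = pd.DataFrame({
    "gl_id": [1, 2, 3],
    "amount": [250.00, 1200.00, 99.95],
    "date": pd.to_datetime(["2019-06-03", "2019-06-05", "2019-06-07"]),
})
bank = pd.DataFrame({
    "bank_id": ["a", "b"],
    "amount": [250.00, 99.95],
    "date": pd.to_datetime(["2019-06-04", "2019-06-07"]),
})

candidates = gl.merge(bank, on="amount", suffixes=("_gl", "_bank"))
matched = candidates[
    (candidates["date_gl"] - candidates["date_bank"]).abs() <= pd.Timedelta(days=3)
]
unmatched = gl[~gl["gl_id"].isin(matched["gl_id"])]
print(matched[["gl_id", "bank_id"]])
print("needs manual review:\n", unmatched)
```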

Is your data expansive and up-to-date?

Once you are comfortable that manual tasks have been automated, and are confident in the data being used to drive decisions, the next thing to consider is whether your data sets are expansive enough for intelligent analysis. Having a foundation of clean, relevant data is fantastic, but there must be enough of it to reliably answer pressing business questions.

At the same time, it’s vital to examine whether the data you do have access to is actually up-to-date. After all, why use data that is a month old to make decisions now? Continuous accounting, for example, which shifts the finance department from a monthly to a near real-time data cycle, makes it easier to deliver forward-looking, strategic insights that benefit the rest of the business.

Is today’s data ready for tomorrow’s demands?

Finally, finance departments need to ask whether the data foundations they are building today will be fit for purpose tomorrow. Much of this comes down to a question of trust: do you trust the processes you have in place to deliver data that is accurate, reliable, scalable and usable – not only now, but also in the long run?

Finance departments are bracing themselves for further technological disruption. A lack of trust in your data today not only has implications for human decision making, but it could also impact technology that learns from this data further down the line. While truly intelligent technology, like AI, may not be a reality for the finance department just yet, establishing its data integrity now will ensure it is ready for these advances when they do arise.

 

In this new model, decision-makers not only have to make strategic choices more quickly but also need instant access to the right information to ensure those decisions are well-informed.

For CFOs, this means being able to make agile investment decisions but with so many potential ways to go - how can we gather fast, accurate insight to ensure we make the best choices? And, just as important - how can we understand where we’ve made the wrong decisions so we ‘fail fast’ and move on? Halvor W. Stokke, CFO of Confirmit, answers these questions.

Moving beyond numbers

Those outside the finance department often still believe that all that keeps us awake at night is numbers. Of course, the reality is that the finance function has evolved just as much as all other aspects of our organisations in recent years. Numbers are just our route to information - they are passive and only provide part of the story. They’re certainly not the objective of a CFO’s role. Or at least, they shouldn’t be.

To do my job properly, and to make the best decisions for the company as a whole, I need insight – just as any other business department does. And that means being able to understand the bigger picture of our organisation, taking into account external forces such as market trends and the competitive landscape, as well as the broader economy. There is also a host of internal factors to consider spanning product, service, operations, employment and customer practices.

This bigger picture which brings all of these elements together simply can’t be gained from numbers alone. It relies on a careful combination of insight gathered from across the business and presented in a way that tells us, based on clear evidence, how the investments and finance decisions we make will affect our strategic goals and our specific business KPIs.

Insight gathered at speed

But we don’t only need this holistic insight. We need it quickly and continuously. We need to be as agile as the market we serve and the competitors within it – if not more so.

As a software organisation, we’re used to the fail-fast approach that’s long been associated with agile product development. We know that speed of delivery is often more important for success than first-time perfect delivery. Being agile in this way means we can continue to perfect our product while it’s already in the marketplace and deliver value to customers. It also means we’re much more likely to align with the changing needs of those customers.

The modern role of the CFO needs to follow exactly the same approach. Gather as much insight as we can, as accurately as we can, and then make the finance and investment decisions that we believe will have the greatest positive impact at that moment in time.

Our decisions may not always be perfect, but because we can be agile, we can make new decisions more quickly – offsetting the potential impact of previous wrong choices. We also gain the knowledge we need to pull investment more quickly when needed, rather than continuing to invest time, money and resource into a route of poor return.

Failure is an option

Of course, no one wants to be associated with failure. It’s human nature to want our decisions to succeed, and the fundamental goal of a business’s senior leadership team is success and growth. But failing fast in decision-making and investment choices is actually about creating long-term success – by learning from the knowledge we gather at every decision point and adapting our future choices as a result.

Rather than failing fast, I call this ‘knowledgeable speed’. That’s because we’re making immediate informed, data-driven choices to maximise our chances of long-term ROI. This means the modern CFO role is now much more aligned to strategic business development than to fiscal calendars and quarterly reports. Of course, financial and accounting processes and procedures will always be adhered to, but they are part of our reporting suite and no longer an end goal in themselves.

Harnessing the best sources of insight

With such a focus on agility and speed, it may seem odd that we’d see investment in long-term, continuous Employee Experience and Customer Experience programmes as a critical component of an agile corporate strategy. But that’s exactly the approach we advise.

That’s not only because employees and customers are the most valuable asset for any organisation. It’s also due to the fact they are the most accurate barometer of market trends, providing the leadership team with a view on the pulse of a market in continual flux.

Used in the right way, the insight gathered from these two groups can be the catalyst for highly profitable organisational transformation. Not only can it help to predict changing behaviours and inform new strategic direction, but a continual, two-way dialogue with both customers and employees ensures that they are on board with change as it happens.

This is not just a ‘touchy-feely’ approach to management, but a real driver of success, since change driven by everyone is much more likely to lead to long-term results than initiatives led by an individual’s ‘vision’.

A cross-functional approach

It’s this approach to embracing wider business and market insight that sets forward-thinking leadership teams apart from the crowd. When CFOs work with other functions to understand the challenges and opportunities that exist around the business, it’s more likely that we’ll make informed investment decisions. The wider effect is that we can simultaneously improve a range of KPIs and positively impact the bottom line.

For example, if we work more closely with CMOs, we’re able to create an accurate picture of how the customer experience we deliver impacts financial performance. Similarly, linking our work with HR heads gives us better insight into how employee engagement may be affecting sales, customer retention or service levels.  Individually, we can’t make this correlation as, naturally, the data we gather is departmentally siloed.

Aligning data and leadership culture

Integrated business data and insight can only work, however, if we have a closely aligned leadership team. Working cross-departmentally supports our holistic, ‘fail fast’ approach to decision-making because we all understand the fuller business picture and can better identify opportunities for change and growth – regardless of where initiatives begin.

What’s more, each department can prove their individual impact on KPIs, giving a greater understanding of the improvements or changes needed to enhance both departmental and overall business performance.

Integrated data from across departments provides the additional benefit of linking cause and effect, giving department heads the evidence they need for future investment requests.

A continual evolution

Of course, just like the industries in which we operate and the markets we serve, our own roles are continually evolving. While a CFO is still accountable for the financial health of an organisation, we’re also contributors to a much wider range of decisions than we were five years ago.

Our roles will continue to change as the lines between ‘ownership’ become increasingly blurred. We’re no longer owners of the balance sheet, just as sales is no longer the owner of customers – that’s a responsibility that falls to every employee in a truly customer-centric business.

So, if as a CFO I need to drive financial success in an agile, ever-changing industry, understanding numbers is no longer enough. Understanding everything about my business is now the minimum requirement for staying ahead.

 

About the author:

Halvor W. Stokke joined Confirmit as CFO in 2017 and holds responsibility for the company’s financial stability and growth. In this position, he focuses on the long-term strategy for Confirmit, including both organic growth and all merger and acquisition opportunities.

Website: www.confirmit.com

Nowadays, it’s incredibly easy to buy bitcoin, and the advantages and disadvantages of bitcoin are becoming clearer every day.

Despite all of that, we are still a long way from bitcoin mass adoption. Sure, there are a lot of businesses starting to accept bitcoin as payment, but there is still a huge chunk of the world that is unaware of how cryptocurrencies work. Some people are even unaware that crypto exists. Those are the people that need to be included in a global peer-to-peer currency.

Why people should start using cryptocurrencies

Whether we like it or not, cryptocurrencies are now part of the global economy. If more people could see why people buy, sell and trade these cryptocurrencies, then maybe we could move one step closer to mass adoption. Here are some of the main reasons why people should start using cryptocurrencies.

Fees

If you have a bank account, then you should know by now that these accounts usually have fees associated with them. Credit/debit card fees, ATM fees, merchant fees, checking account fees, etc.

Compare that to cryptocurrencies: payment gateways such as BitPay and CoinPayments charge between 0.5% and 1% per transaction. Next to the bank’s fees, this is nothing. Digital wallets come free of charge (unless you decide to invest in a hardware wallet).
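A quick back-of-envelope comparison makes the point; the card-processing rate below is an assumed typical figure for illustration, not one quoted here.

```python
# Fee comparison on a $1,000 payment. The crypto range comes from the
# gateway fees above; the card rate is an illustrative assumption.
payment = 1_000.00

crypto_low, crypto_high = payment * 0.005, payment * 0.01    # 0.5%-1%
card_fee = payment * 0.029 + 0.30                            # assumed 2.9% + $0.30

print(f"crypto gateway fee: ${crypto_low:.2f}-${crypto_high:.2f}")
print(f"card fee (assumed): ${card_fee:.2f}")
```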

Privacy

When you make a purchase using your credit or debit card, the bank, as well as the retailers and service providers, obtain and retain a lot of your personal and financial information. This information includes names, addresses, employers, social security numbers, etc.

Cryptocurrency transactions provide an alternative by limiting the data to a string of letters and numbers (a.k.a. a wallet address). Transaction IDs are also used to confirm that a wallet-to-wallet transaction took place.
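To make that concrete: in Bitcoin, for example, a transaction ID is derived from a double SHA-256 hash of the serialised transaction. The toy Python snippet below illustrates the idea; the payload is a stand-in, not real transaction encoding.

```python
# Toy illustration: a transaction ID is just a hash of transaction
# bytes, so by itself it reveals nothing about the parties involved.
import hashlib

raw_tx = b"toy-serialised-transaction-bytes"   # stand-in payload
txid = hashlib.sha256(hashlib.sha256(raw_tx).digest()).hexdigest()
print(txid)   # 64 hex characters, usable to confirm the transfer
```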

Globalization

In three words: cryptocurrencies are borderless. Transactions are not only instant and cost-effective, but can also be made across the world. There is no waiting, no international fees, and no limitations. All you need is a smart device that can connect to the internet. Because of this, the unbanked population has an alternative solution when it comes to paying bills and earning a living.

How people can start using cryptocurrencies

There are many exchanges out there that will help you get started on your crypto journey. Remember to do extensive research on all your potential bitcoin exchanges so that you can decide which one suits your trading style the most. If you are unsure about which bitcoin exchange to use, try making small investments on your top platforms and see which system works for you. Once you’ve settled on a platform, you can start making larger investments.

People can also start using crypto if others give them the opportunity to use it. Adoption is the key here. The more businesses offer crypto as a payment option, the more likely people are to follow. If more trading platforms showcase how crypto can help unbanked people around the world, more people will start to use it. It all starts with the community.

Riding the wave

Bitcoin and other cryptocurrencies are in a more stable place now and people are starting to hop on the ride. All that stands in the way of mass adoption is people being educated on how cryptocurrencies work and what good they can bring to the world.

The crypto community owes it to itself. The more positive crypto news there is, the more other people will be attracted to the technology.

Almost a third of these breaches were down to organisations neglecting simple security procedures, whilst over three quarters were caused by issues at the application layer, often related to out-of-date software, insecure third-party payment systems, or inadequate scanning. All of these breaches therefore contravened Payment Card Industry Data Security Standard (PCI DSS) requirements.

In one organisation, up to 40 employees used the same password for the server, and had full admin rights to the overall system. Another case saw a coding error present in the website login page, which enabled an attacker to obtain usernames and password hashes – ultimately allowing access to the organisation’s web server.
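For context on that second case, the control that was missing is salted, deliberately slow password hashing, so that a leaked table of hashes is not directly usable. The Python sketch below shows one standard approach; the parameters are illustrative choices, not a statement of PCI DSS requirements.

```python
# Salted, slow password hashing with PBKDF2 (standard library).
# Iteration count and salt size are illustrative choices.
import hashlib, hmac, os

def hash_password(password, salt=None):
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)   # constant-time compare

salt, stored = hash_password("correct horse battery staple")
print(verify("correct horse battery staple", salt, stored))   # True
```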

The analysis also revealed that the £1.74 million in fines issued for these incidents by the ICO in this time period could have amounted to almost £889 million under the General Data Protection Regulation (GDPR).

Phil Bindley, managing director at data centre and managed service provider, The Bunker commented: “PCI DSS compliance is a continuous journey and one that requires regular assessment to identify any weaknesses across an organisation.

“Regulators aren’t going to be lenient about failings in this space, and if businesses don’t invest enough into improving defences, we’re going to see more organisations having to pay the price for a relaxed approach to security.”

Simon Fletcher, managing director at cyber security specialist, Arcturus added: “We’re still seeing businesses failing to implement even basic measures when it comes to securing sensitive information.

“The need for regular and thorough testing is clearly outlined by PCI DSS, and is something that is still forgotten by many or causes confusion, particularly when it comes to the application layer. Testing systems is vital in order to ensure that any issues are quickly addressed to prevent data being put at risk.”

Following on from the success of our pilot festival in New York last December, DATAx is proud to present the next instalment of our global series of data-driven festivals: DATAx Singapore, offering 4 stages, 50 speakers and 450 innovators. As one of the leading players in smart city technology – with autonomous vehicles, smart sensor platforms and applications of artificial intelligence (AI) – the city-state rarely stays out of the technology headlines.

Global consulting group McKinsey estimates that 70% of businesses will rely on AI to automate functions by 2030, adding $13 trillion in growth to the global economy. Contrary to popular belief, the World Economic Forum has stated that the shifting dynamics between machines and humans will add an estimated 133 million new roles to the workforce.

The total impact of AI for business remains a looming question for many companies; DATAx aims to remedy this uncertainty by preparing business leaders to address topics like machine learning, analytics, data talent and big data.

With a worldwide community feeding into the series, DATAx remains dedicated to connecting delegates with the latest research, cutting-edge technology and the hottest startups to share highlights and breakthroughs in artificial intelligence (AI) and data science reshaping the world.

Luke Bilton, Managing Director of Innovation Enterprise, commented: “While AI has now become a real opportunity, it also brings real 'survival of the fittest' challenges – only fast-moving businesses are likely to succeed.

“DATAx is a new style of event, designed to arm early adopters with the tools they need to move fast. We are excited to welcome the world's most innovative brands and partners to build something unique.”

In less than 2 weeks’ time, over 500 data leaders from tech giants, financial innovators, emerging startups, government bodies and more will gather at DATAx Singapore. Confirmed speakers from leading brands include American Express, Citibank, Visa, Boston Consulting Group, IBM, Netflix, Oracle, AIA, Axiata, Dyson, Singapore Exchange Limited, Google and many more.

The agenda features 50+ speakers covering Asia's critical data solutions in business practice. Grasp the chance to access unparalleled learning opportunities and see the latest AI and machine learning applications in technology, finance, marketing and smart cities, across 5 stages.

SESSION HIGHLIGHTS INCLUDE

• ‘Omnichannel Personalisation Like You've Never Seen’, Client Team and Global E-commerce/Data Lead, WPP
• 'Leveraging analytics for more efficient media attribution and allocation', Group VP, Head of Analytics, Axiata
• 'Combining human and artificial intelligence: how creatives and data scientists work in concert at Netflix to craft a personalized experience.' Senior Data Scientist, Netflix
• 'Creating a data-driven go-to-market strategy', Head of Applied Data Science, Dyson
• 'Deep learning and computational graph techniques for derivatives pricing and analytics', Head of Data Science and Visualisation at Singapore Exchange Limited
• 'Ready or not: does your organization have a sound data strategy to run successful AI projects?', Big Data & Analytics, ASEAN, Oracle
• 'Organizational cultural changes required for success in Artificial Intelligence', CDO, Head of Science, Visa
• 'Machine learning and AI in American Express risk management', VP, Fraud Risk Decision Science, American Express


Unfortunately, by prioritising ad-hoc incident resolution, organisations struggle to identify and address recurring data quality problems in a structural manner. So what is the correct approach? Boyke Baboelal, Strategic Solutions Director of Americas at Asset Control, answers the question for Finance Monthly.

To rectify the above issue, organisations must carry out more continuous analysis, targeted at understanding their data quality and reporting on it over time. Not many are doing this and that’s a problem. After all, if firms fail to track what was done historically, they will not know how often specific data items contained completeness or accuracy issues, nor how often mistakes are made, or how frequently quick bulk validations replace more thorough analysis.

To address this, organisations need to put in place a data quality framework. Indeed, the latest regulations and guidelines increasingly require them to establish and implement this.

That means identifying what the critical data elements are, what the risks and likely errors or gaps in that data are, and what data flows and controls are in place. By using such a framework, organisations can outline a policy that establishes a clear definition of data quality and its objectives, and that documents the data governance approach, including processes and procedures, responsibilities and data ownership.

The framework will also help organisations establish the dimensions of data quality: that data should be accurate, complete, timely and appropriate, for instance. For all these areas, key performance indicators (KPIs) need to be implemented to enable the organisation to measure what data quality means, while key risk indicators (KRIs) need to be implemented and monitored to ensure the organisation knows where its risks are and that it has effective controls to deal with them.
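As a simplified illustration of such KPIs and KRIs in practice, the pandas sketch below scores a toy price feed for completeness and timeliness; the column names, dates and five-day staleness threshold are assumptions for the example.

```python
# Toy data quality KPIs/KRIs over a price feed (illustrative only).
import pandas as pd

prices = pd.DataFrame({
    "instrument": ["A", "B", "C", "D"],
    "price": [101.2, None, 99.8, 100.1],
    "as_of": pd.to_datetime(["2019-06-10", "2019-06-10",
                             "2019-06-03", "2019-06-10"]),
})
today = pd.Timestamp("2019-06-10")

kpis = {
    "completeness_pct": 100 * prices["price"].notna().mean(),
    "timeliness_pct": 100 * (prices["as_of"] == today).mean(),
}
kris = {"stale_records": int((today - prices["as_of"]).dt.days.gt(5).sum())}
print(kpis, kris)
```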

A data quality framework will inevitably be focused on the operational aspects of an organisation’s data quality efforts. To take data quality up a further level though, businesses can employ a data quality intelligence approach which enables them to achieve a much broader level of insight, analysis, reporting and alerts.

This will in turn allow the organisation to capture and store historical information about data quality, including how often an item was modified and how often data was erroneously flagged. More broadly, it will enable organisations to achieve critical analysis capabilities for these exceptions and any data issues arising, in addition to analysis capabilities for testing the effectiveness of key data controls and reporting capabilities for data quality KPIs, vendor and internal data source performance, control effectiveness and SLAs.

In short, data quality intelligence effectively forms a further layer on top of the operational data quality functionality provided by the framework, which helps to visualise what it has achieved, making sure that all data controls are effective, and that the organisation is achieving its KPIs and KRIs. Rather than being an operational tool, it is effectively a business intelligence solution, providing key insight into how the organisation is performing against its key data quality goals and targets. CEOs and chief risk officers (CROs) would potentially benefit from this functionality as would compliance and operational risk departments.

While the data quality framework helps deliver the operational aspects of an organisation’s data quality efforts, data quality intelligence gives key decision-makers and other stakeholders an insight into that approach, helping them measure its success and demonstrate the organisation is compliant with its own data quality policies and relevant industry regulations.

The financial services industry is starting to focus more on data quality. In Experian’s 2018 global data management benchmark report, 74% of financial institutions surveyed said they believed that data quality issues impact customer trust and perception and 86% saw data as an integral part of forming a business strategy.

Data quality matters. As Paul Malyon, Experian Data Quality’s Data Strategy Manager, puts it: “Simply put, if you capture poor quality data you will see poor quality results. Customer service, marketing and ultimately the bottom line will suffer.”

In financial services with its significant regulatory burden, the consequences of poor data quality are even more severe. And so, it is a timely moment for the rollout of the multi-layered approach outlined above, which brings a range of benefits, helping firms demonstrate the accuracy, completeness and timeliness of their data, which in turn helps them meet relevant regulatory requirements, and assess compliance with their own data quality objectives. There has never been a better time for financial services organisations to take the plunge and start getting their data quality processes up to scratch.

This week Finance Monthly hears from Mohit Manchanda, Head of F&A and Consulting EXL Service UK/Europe at EXL, on the ever-evolving DNA of a CFO.

Business leaders have to stay relevant, ahead of the curve and adaptable to the constantly evolving world of finance. This development has become ever more apparent for the Chief Financial Officer (CFO), whose role now spans strategy, operations, communication and leadership, as well as building knowledge of the impact of emerging technologies within the finance sector.

Business outcomes

Advances in data software and automation are opening up avenues for businesses to generate valuable insights that can lead to major productivity improvements. Within the finance and accounting areas, technology is becoming a catalyst for change, driving innovation and providing operational efficiency in business-critical functions.[1] It is essential for CFOs to rethink how to utilise this opportunity to streamline their processes for efficiency, compliance and risk management.

CFOs have many objectives to commit to, and by using cutting-edge solutions to enhance the transparency and accuracy of financial data, they can better control the financial management process. Using automation within finance frees up time for high-value tasks and alleviates the pressure on the CFO to perform traditional activities such as transaction processing, auditing and compliance.

Human X Machine

It is becoming more and more evident that the CFO will be expected to drive the utilisation of new technologies; however, they should not get ahead of themselves and forget about the day-to-day business. Becoming too attached to the hype surrounding automation and analytics can put other business objectives on the back burner. For example, managing costs and coming up with new ways to generate profit are tasks that require the CFO to use their own industry knowledge rather than relying on data or analytics.

New technologies can speed up processes and lessen tasks for CFOs; it is important for them to make choices and identify the processes where AI, automation and machine learning add value. An investment in one area of a business can create savings in another. In most companies, a high percentage of staff still perform tasks that could be automated through machine learning, and these tasks can be performed exponentially faster if self-learning algorithms are applied.

Given the pace of technological change, CFOs should carefully evaluate their point of entry and roll out multiple pilots or proofs of concept (PoC) to test and secure validation before deploying these new technologies.

Introducing innovative technologies within the finance sector does help take lesser tasks off the CFO's plate; however, it is not the technology alone that enables a more streamlined work process. Combining talent, skill sets and technology creates a unified approach, resulting in major improvements throughout the business. For CFOs, it means they can move away from everyday traditional accounting tasks, freeing up time to use their industry knowledge to focus on new business opportunities and provide strategic guidance.

Data & Domain

Organisations, regardless of their size, collect large masses of data, most of which will never be utilised. It is important for CFOs to understand which data sets are of value and which are not. Some may be needed for regulatory purposes and others for commercial predictions and products; disregarding the sets that are not of value helps to create a more streamlined result.

Experimenting with data will help identify potential risks before models are put into production. Machine learning is all about data experimentation, hypothesis testing, fine-tuning data models and automation. Bringing data, technology and talent together in the form of ideation forums, innovation labs and skunkworks projects allows discrete data to be tested for the first time. Machine learning can then identify hidden patterns that could potentially harm the production process.
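As a small, hedged example of that kind of experiment, the scikit-learn sketch below uses an isolation forest to flag unusual records in synthetic data before anything reaches production; the features and contamination rate are illustrative assumptions.

```python
# Anomaly-hunting sketch: an isolation forest flags records that sit
# far from the bulk of the data. Inputs are synthetic for illustration.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=100, scale=10, size=(500, 2))   # typical records
odd = np.array([[100, 400], [900, 95]])                 # hidden outliers
data = np.vstack([normal, odd])

model = IsolationForest(contamination=0.01, random_state=0).fit(data)
flags = model.predict(data)            # -1 marks suspected anomalies
print(f"{(flags == -1).sum()} records flagged for review")
```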

In order to drive the business forward, CFOs can translate data and combine it with industry knowledge. The data helps to provide insight within the industry, which then contextualises their business decisions. By making data-driven decisions, CFOs can be confident in their choices within the organisation and use the data to back up or prove their conclusions.

Putting data under the business lens enables a CFO to understand the repercussions that can occur through the improper use of big data. A business’ reputation is on the line if data violations occur. Not only will this result in legal sanctions, it will limit business operations, which will have a domino effect on resources and a company’s position compared to its competitors.

Therefore, CFOs should review all of the potential consequences before putting their experimental data findings into practice, including any legal, financial and brand implications. This is where industry knowledge comes into play: using an expert committee on business data to inspect algorithms for unintentional consequences results in less risk than is normally associated with machine learning.

For CFOs to thrive in the digital age, it is essential for them to have a unified approach combining industry knowledge, data, technology and talent. By employing new technologies, data, talent and knowledge as one package, CFOs can add continuous learning opportunities for critical talent pools, and assist in the overall improvement of productivity within the business.

[1] https://www.business2community.com/big-data/17-statistics-showcasing-role-data-digital-transformation-01970571

Ralf Gladis, CEO of Computop, answers questions surrounding regulation and global consensus, with some interesting pointers on privacy and trade therein.

Cryptocurrencies are expected to reach a major turning point in 2019, but they still attract a great deal of controversy. There is no doubt that the digital currency market is growing, and fast, but support from the institutions that matter is far from consistent.

In November, Christine Lagarde, head of the IMF, called for governments to consider offering their own cryptocurrencies to prevent fraud and money laundering. Governments, by contrast, tend to err on the side of caution, with the vast majority sceptical of what they see as the 'Wild West of crypto-assets', in which investors put themselves at unnecessary and heightened risk. In part this is because a core role of government is to prevent turmoil in central systems; however, many have acknowledged that cryptocurrency has a momentum that cannot be ignored and that regulation could help to bring about a more sustainable and less volatile crypto environment.

The scenario is changing all the time, and it is worth considering what would actually happen if all governments agreed that digital currencies were good:

  1. Currency formats: If all governments loved crypto currencies they would probably not love the same currency, so if one country introduced Bitcoin and another Ethereum, we would then be faced with the difficulties of handling the exchange.
  2. Economic Policy: The value of money is a playground for politicians of all sides. Expanding the availability of money, for instance, leads to devaluation of a currency which is supposed to help export-orientated economies when selling goods and services abroad. Such policies can only work if a government has the sole power to expand or decrease the amount of money within its own economy. No central bank would be willing to give that power away. That’s why we would end up with many crypto currencies in different countries.
  3. Regulation: It's vital for a government to prevent money laundering, fraud and tax evasion. This is simply necessary to protect the country from financial crime and to comply with international rules. Therefore, a crypto currency would be regulated by each country's central bank according to current local requirements for Anti-Money Laundering (AML) and Know-Your-Customer (KYC).
  4. Cash: Despite the availability of crypto alternatives we wouldn’t get rid of cash quickly. With no experience of what a non-cash society means, there are huge risks simply because of a fascination with a new technology. What about people who are travelling abroad, or those who are unbanked?
  5. Privacy: A crypto currency can ensure privacy. However, it can also be designed to be open and very transparent. If crypto currency was THE new currency it would need to be transparent to regulators and criminal investigators. If the design were open to government access this could cause a privacy nightmare. Currently, payment data is distributed over many issuing and acquiring banks. Accessing this legally is not easy and requires a judge. A large transparent crypto currency database which is open to governments sounds like an invitation for misuse by government agencies that might mean well but would do ill anyway.
  6. Trade: B2C transactions require payment schemes that act as a mediator between merchants and consumers. Schemes like Visa and MasterCard have established a worldwide rule-set that balances the interests of merchants and consumers. What if a fraudster used a fake identity and the actual consumer required the merchant to pay back his money? What if a consumer sent back a few products and required a partial refund? And if the merchant failed to react? Many such exceptional but nonetheless possible scenarios are the reason why issuing and acquiring banks have to enforce the rules set by Visa and MasterCard. That also applies to other payment systems like American Express, Discover and PayPal who set and enforce their rules themselves directly with both consumers and merchants. B2C payment needs schemes. In that respect it doesn’t matter whether the currency is digital, physical or crypto.
  7. Ecology: Several central banks have already tested crypto currencies. The results were devastating: for large-scale use, crypto currency is much too slow and requires too much energy and storage to be feasible.

It looks like there is still a lot of work to be done before crypto currency gets anywhere near to being acceptable to governments.

When the General Data Protection Regulation (GDPR) came into force in May, it affected every company that does business within the European Union and the European Economic Area (EEA). Its main purpose is the protection of each individual's data, but its privacy and compliance obligations have put a significant burden on companies of all sizes and across all sectors.

Similar legislation exists in Turkey, although there are distinct differences. On one notable point, however, they are in harmony: just as not complying with GDPR requirements carries substantial penalties, so does any breach of the Turkish provisions, with failure to comply leading to administrative fines and criminal penalties. As a result, every company that does business in Turkey already, or which plans to do so, needs to be aware of how these laws might affect its operations.

Partly in anticipation of GDPR, Turkish Data Protection Law (DPL) was enacted in 2016. Turkey’s supervisory authority, The Personal Data Protection Board (DPB), is still publishing assorted regulations and communiqués relating to it, as well as draft versions of secondary legislation. Under these changes, data controllers who deal with personal data are subject to multiple obligations. In addition, the legislation also applies to ordinary employees, making it significant for every company operating in Turkey.

So when comparing DPL with GDPR, what are the differences that impact businesses operating in Turkey? Although it stems from EU Directive 95/46/EC, DPL features several additions and revisions. It does, however, contain almost all of the same fair information practice principles, except that it does not allow for a "compatible purpose" interpretation, and any further processing is prohibited. Where the subject consents to data being compiled for a specific purpose, the controller can only use it for another purpose if further consent is obtained, or if the further processing is needed for legitimate interests.

The grounds for processing under DPL are similar to GDPR, save that explicit consent is needed when processing both sensitive and non-sensitive personal data. Inevitably, this is much more time-consuming. Such a burdensome obligation would initially make it seem that DPL provides a higher level of data protection than GDPR, but DPL's definition of explicit consent also has to be compared with GDPR's regular consent. 'Freely given, specific and informed consent' is common to both, while GDPR further requires an 'unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her'.

While DPL consent might appear to be less onerous than GDPR, no DPB enforcement action has yet occurred: interpretation of explicit consent therefore remains uncertain. Under DPL, the processing grounds for sensitive personal data are notably more limited than under GDPR – with the exception of explicit consent, the majority of sensitive personal data can be processed, but only if it is currently permitted under Turkish law. The sole exception is data relating to public health matters.

Equally burdensome under DPL is the cross-border transfer of personal data to a third country. As determined by the DPB, the country of destination must have sufficient protection – either that, or parties must commit to provide it. DPL also states that: “In cases where interests of Turkey or the data subject will be seriously harmed, personal data shall only be transferred abroad upon the approval of the Board by obtaining the opinion of relevant public institutions and organisations”. Under this provision, data controllers must decide whether a transfer could cause serious harm, and if it does, they need to obtain DPL approval. However, it is unclear how these interests might be determined.

Under GDPR, controllers have to maintain internal records but face no general requirement to register with the data protection authorities. DPL instead has a hybrid solution, with both registration and record-keeping requirements. It specifies a registration mechanism: data controllers have to register with a dedicated registry. Under a draft DPB regulation, before completing their registration they are required to hand over their Personal Data Processing Inventory and Personal Data Retention and Destruction Policy to the DPB.

For businesses which have to comply with DPL, GDPR, or both, it would be prudent to ensure that they are not duplicating their efforts. The best way to achieve this is by aiming for a flexible compliance model that successfully meets the obligations of the regulatory authorities across multiple jurisdictions.

Website: www.kilinclaw.com.tr/en/

 
