Finance Monthly

Another financial year has passed, and as you look back, will you seek to do things differently next time around? Below, Dean Snappey, President and Co-Founder of DocsCorp, discusses with Finance Monthly five simple accounting tools that will make this time of year much easier to navigate.

Over the past 12 months we have seen considerable adjustments to taxation, such as changes to dividend taxation and the recent tax increase for landlords. Aim to prevent the end-of-year mess and avoid the kind of errors that carry implications for your own and your company's reputation.

There are several accounting tools and software solutions available at your fingertips to ease the process, stay organised and plan ahead. Make the most of these accounting tools and follow these five easy steps to make pre-emptive tax planning simple.

  1. Bundle Documents. Best practice is to ensure any invoices, statements and paperwork are scanned and saved as a PDF document as paper filing systems and loose documents are at risk of being lost or damaged. Document bundling software gives you the ability to collate PDF content and bind all relevant documents such as tax returns, invoices and financial accounts into a single file for each client. By bundling all important documentation, it will make the process of finalising paperwork much easier as all documentation required will be stored together. Finalised documents can then be sent as a single secure file in one hit rather than multiple attachments, saving you time with each client and avoiding the risk of lost or separated files.
  2. Redaction. As part of preparing documents containing such complex data, it is essential to redact classified or sensitive information. To redact is to remove; 'redacted' therefore describes documents from which sensitive information has been expunged. A redaction tool makes it easy to search whole documents and redact multiple instances of a word or string of words quickly and reliably. Redact personal information or bank account details in documents as part of an audit.
  3. Make Scanned Documents Searchable. When you scan invoices, receipts and finalised paperwork, the scanned documents are images that lack the text layer needed to make them searchable. You can add that layer with an Optical Character Recognition (OCR) tool, which converts scanned, printed or handwritten files into machine-readable text. You can then search whole documents for a word, phrase or set of numbers, which makes finding information or documents for tax audits a breeze.
  4. Remove Metadata. Metadata may appear hidden, however, every time you annotate, edit or alter a document, your decision is stored invisibly within the very document you’re working on. Metadata removal software removes metadata from files – eliminating any risk of unintentional information leakage. To put it simply, the metadata cleaner will ensure you only send the part of the document or spreadsheet you intend to – exactly what would be printed out on a piece of paper. Remove hidden cells and other sensitive metadata before uploading or attaching files intended to be sent to clients.
  5. Compare Documents for Change. Working with numerous clients means navigating through each client’s financial accounts and complex paperwork. Throughout the year many changes can occur within clients’ businesses, often meaning multiple changes to important documentation. Analysing large volumes of documentation, in which precision is required, can be very time consuming. There is a simpler way to undertake this task, which is by using a Comparison software package. This enables you to compare and analyse the differences between two documents with incredible accuracy and reliability. The software is able to show you the smallest change in documents, spreadsheets, PDFs without needing to convert them first. Save time and never miss an important change again.
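The comparison step above can be sketched with Python's standard-library `difflib`; this plain-text sketch (with made-up figures) only illustrates the idea, not how commercial comparison packages work internally:

```python
import difflib

# Two versions of a (made-up) summary page for the same client.
old = ["Revenue: 1,200,000", "Costs: 800,000", "Net: 400,000"]
new = ["Revenue: 1,250,000", "Costs: 800,000", "Net: 450,000"]

# Keep only the changed lines from a unified diff, dropping the
# '---'/'+++' file headers and the unchanged context lines.
diff = [line for line in difflib.unified_diff(old, new, lineterm="")
        if line.startswith(("+", "-")) and not line.startswith(("+++", "---"))]
print(diff)
```

Here `diff` comes back as the two changed pairs (`Revenue` and `Net`), with `Costs` untouched. For real documents you would first extract the text from the PDF or spreadsheet; dedicated tools skip that conversion step and also catch formatting-level changes.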

Rob Mellor, General Manager at WhereScape, explains the process from data to decision, and how any business, large or small, needs to make its data amalgamation efficient in order to move forward.

They say moving house is one of the most stressful things you can endure. Having moved recently, I can confirm the old adage rings true! And this was despite paying extra for packers to do all the 'hard work'. But here's the thing: the packers didn't really save us much time because prior to them arriving, we had to spend weeks sorting and prepping our belongings into the right piles to then be boxed and shifted! This is effectively what data architects have to do if they choose not to automate the building of a data warehouse.

Imagine if I could automate the sorting of all my belongings into neatly organised boxes. Then imagine if I could automate the sorting of many families’ belongings without having to visit their houses - even taking into account the unique requirements that each individual customer has. In effect, this is what solutions like WhereScape do: help businesses to intelligently automate the gathering of data, and allow them to dramatically speed up the time it takes to drive value from it. Automating the process of data gathering can drive real business value and provide a flexible, templated approach to automation, personalised for every business requirement.
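As a toy sketch of that templated idea (the source definitions, table names and function below are invented for illustration and bear no relation to WhereScape's actual templates, which also handle history, scheduling and documentation), target tables can be generated from a declarative template rather than written by hand:

```python
import sqlite3

# Hypothetical source metadata; real automation tools read this from
# the source systems rather than having it hand-written.
SOURCES = {
    "crm_contacts": {"id": "INTEGER", "name": "TEXT", "email": "TEXT"},
    "erp_invoices": {"id": "INTEGER", "amount": "REAL", "issued": "TEXT"},
}

def build_warehouse(conn, sources):
    """Generate and execute CREATE TABLE statements from the template."""
    for table, columns in sources.items():
        cols = ", ".join(f"{name} {ctype}" for name, ctype in columns.items())
        conn.execute(f"CREATE TABLE {table} ({cols})")

conn = sqlite3.connect(":memory:")
build_warehouse(conn, SOURCES)
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]
print(tables)  # ['crm_contacts', 'erp_invoices']
```

Adding a new source then means adding one template entry, not writing new DDL by hand for every target table.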

To give a real world example, Xerox Financial Services (XFS), a $2bn business spanning 14 countries, has the challenge of putting all kinds of data requests into its data warehouse and producing rapid, accurate business intelligence for its local leasing companies across 14 countries. The rapid growth of the company led to business structures growing up in parallel, creating disparate data and variations in business processes. In the past, these data sets had to be looked at in individual silos because of their breadth and complexity. This meant getting an accurate overall picture took so long that often the information was out of date by the time it was delivered.

XFS now consolidates and transforms all of its data sets into a harmonised model using WhereScape. Integrating multiple tables of data from each source has enabled XFS to create a variety of management reports, including an up-to-date snapshot of sales performance on a daily basis, allowing senior management to ensure targets are hit.

Over the last year, XFS has doubled the number of automated processes yet maintained data quality and decision-making. Whereas previously the business had to rely on a monthly 'cycle' of data, it now uploads data every day, allowing agile, fast and effective decision-making based on relevant, timely snapshot and trend data. By automating this data collection process, XFS can also engage more effectively with its partners. Through monitoring the value of each relationship, it has a better understanding of the number of proposals sent in by customers, average deal size and the number of order agreements. And the most tangible outcome? The level of automated credit decision-making has increased significantly without compromising the credit quality of the portfolio, with complex statistical modelling supported by data collected daily and transformed by WhereScape. This is a huge leap forward for XFS.

The only value of data is its ability to drive the right business decision. Yet we constantly see businesses failing to do this because of avoidable failures in how they manage it. The automation process XFS has deployed with WhereScape demonstrates that it doesn't have to be that way. There is a choice, and choosing the right process will drive a significantly improved commercial outcome. Now, if only I could come up with something similar to help with my packing the next time I move house!

Technology is often described as ammunition for evolution, and the statement holds just as true for the growth of businesses. Below, Finance Monthly hears from Frédéric Dupont-Aldiolan, VP Professional Services at Sidetrade, on the latest and upcoming innovations that have hit 2017 hard.

Artificial intelligence, robotics, machine learning and the Internet of Things: 2016 stood out as a year marked by technological development and significant advances in several fields, not least that of connected, driverless cars. Against this backdrop, a clear trend is appearing: the growing influence of robotic technology in daily life.

In 2017, we have seen more promising innovations, here is my review of the top five things we are seeing:

5. IoT, the Internet of Things

The star of the Consumer Electronics Show (CES), which took place in Las Vegas in January, and of Viva Technology, which took place in Paris, the Internet of Things was thrust into the spotlight in 2016 and continues to bring increasingly intelligent connectivity to our daily lives. Smart devices, equipped with barcodes, RFID chips, beacons or sensors, are taking the lead and enabling companies to gain greater visibility over their transactions, staff and assets.

In 2016, information and technology research and advisory company Gartner estimated that there were 6.4 billion connected devices globally, an increase of 30% on 2015. By 2020, this figure is likely to have grown to 20.8 billion.

4. The explosion of Big Data

Network multiplication brings with it a proliferation of data generation, whose analysis, use and governance have become a burning issue. According to estimates by IDC, an international provider of market intelligence for information technology, by 2020, every connected person will generate 1.7MB of new data per second.

The concept of ‘perishable data’ has lost validity. In 2017, companies now have the capability to use data before it becomes obsolete. Devices connected via the Internet of Things will rapidly speed up data decoding and processing for actionable insight.

3. The ramp-up of artificial intelligence and automation

Artificial intelligence has been one of the main talking points in technology over the last year. Encompassing areas such as machine learning, robotic intelligence, neural networks and cognitive computing, it is now in daily use in numerous forms, including facial and voice recognition, and handles data of ever-greater velocity, variety and volume.

This year, artificial intelligence has taken on an increasing number of repetitive and automatable tasks, beginning with wider use of ‘chatbots’ with the capacity to give coherent, easily formulated responses. IDC pinpoints robotics driven by artificial intelligence as one of the six innovation accelerators destined to play a major role in the digitalisation of society and the opening up of new income streams. Indeed, Amazon and DHL are already making use of warehouse handling robots.

2. Location technology, the Holy Grail of customer satisfaction

Location technology has taken great strides over the last year or so, to the marked benefit of customer satisfaction in the hotel, health and manufacturing sectors. Customers can now receive geo-targeted offers on their smartphones, for example for promotions or reductions, depending on their physical location.

In 2017, RFID chips enable yet more accurate tracking of customers and enhancement of their buying experiences.

1. Virtual reality makes way for augmented reality

One of the biggest innovations recently has been virtual reality, and with it came much media coverage too. From Facebook to Sony, Google to Microsoft, big brands grasped this new technology to offer an outstanding user experience, through the merging of virtual and real imagery.

In 2017, these virtual devices have acquired an awareness of their environment and give users a real sense of immersion in the digital environment from within their own homes. The potential of augmented reality for business will be harnessed too in the coming months. Some companies, among them BMQ and Boeing, are already employing it to increase their retention and productivity rates, or to provide training to their workforces across worldwide subsidiaries.

Over the next few months, as we gear up for another round of product launches, we should expect to see advancements in these key areas of technological innovation. Within business, this technology should help to improve customer service by streamlining production and processes, saving time and money, as well as providing new and exciting ways to reach and engage with customers, helping to retain existing clients as well as bring in many new ones.

John Orlando is the Executive Vice President and CFO of Centage Corporation - a leading provider of automated budgeting and planning software solutions. With his previous experience concentrated on Financial Planning and Analysis, John has now been with Centage for over 13 years. Here he introduces Finance Monthly to the company and the services that it offers and discusses the relationship between business decisions and technology.

 

Could you tell us about the Company’s ethics and priorities toward its clients?

Centage has been providing budgeting and planning software solutions for over 15 years. We understand that the most important aspect of your job is to develop accurate and timely budgets and forecasts that help you drive growth and profitability for your company. Everything we do at Centage, from product technology and functionality through to training, services and support, is dedicated to making the client experience unique. That is our number one priority.

 

Tell us more about the Budgeting and Forecasting services that Centage offers.

Budget Maestro by Centage is an easy-to-use, scalable, cloud-based budgeting and forecasting solution that eliminates the time-consuming and error-prone activities associated with using spreadsheets. It is designed for small to mid-market companies to support a comprehensive Smart Budgets approach to corporate planning. Its built-in financial and business logic allows users to quickly create and update their budgets and forecasts and never worry about formulas, functions, links or any custom programming. It is the only solution in the market that offers synchronized P&L, balance sheet and automatically generated cash flow reporting. Today, Budget Maestro serves more than 9,000 users worldwide.

 

How has Centage developed into the company that it is today?

 The company was created because the founders saw a need for a budgeting and forecasting solution that was more automated than what existed in the marketplace at the time. We respected the people and the processes that go into creating accurate and timely budgets and forecasts and thought there was a better way. We understood that giving financial professionals a tool that had all the financial and operational logic pre-built was crucial. This went against the traditional formula-based applications that were in existence. Additionally, Centage developed a full set of synchronized financial statements that included a Pro Forma Income Statement, Balance Sheet and Cash Flow that were automatically generated.

The CFO role in general is important to any company because it brings operational and financial discipline to the organization. I am involved with and required to be familiar with every facet of the organization, from financial accounting to operations to human resources. I believe these responsibilities, along with my experience in the FP&A arena building many budgets and forecasts over the course of 25+ years, have helped Centage build the best budgeting and forecasting application that we could.

 

What is the role that technology plays in transforming data for better business decisions?

Technology and business decisions are inextricably linked. Many of the advances in business over the past 50 years have been driven by technology. It has given us the ability to take massive amounts of information from accounting, CRM and operational systems, condense it in one place and instantly review it for trends, so that informed decisions can be made in a much shorter timeframe with little need for manual intervention.

In the case of a CRM system such as Salesforce.com, once you start to use the application it is difficult to fathom how you would have run your sales organization any other way. There are too many pieces of information to keep track of and too many data points could be missed.

Centage similarly has used technology to make our product, Budget Maestro, robust and agile by eliminating all the mundane work associated with preparing budgets and forecasts. We specialize in building out all of the financial and operational finance logic so that the client, as the user, only needs to concentrate on building a set of good business assumptions. Our reporting solution, Analytics Maestro, gives our clients the ability to take the data in Budget Maestro or their resident accounting system and manipulate and analyze it very quickly, so that more informed business decisions can be made.

 

What do you anticipate for the sector in the near future?

One thing that has become clear over the past two to three years is that budgeting and forecasting is moving from the realm of isolated 12-month timeframes and annual budgets and forecasts to more of a rolling budget/forecast approach covering anywhere from 18 to 36 months. This allows the user to plan for a much longer horizon.

Secondly, customers have been asking for budgeting and forecasting systems to reach out to other sub-ledger systems, such as Salesforce, Payroll etc., to gather information, eliminating the need to manually intervene in the data-gathering process.
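A minimal sketch of the rolling-forecast idea (the figures, window and function are invented for illustration and say nothing about Budget Maestro's internals): each new period is projected from the trailing periods, and the projection itself then rolls forward.

```python
def rolling_forecast(actuals, horizon=6, window=3):
    """Project `horizon` future periods, each as the mean of the
    trailing `window` periods, rolling projections forward."""
    series = list(actuals)
    for _ in range(horizon):
        series.append(round(sum(series[-window:]) / window, 2))
    return series[len(actuals):]

# Three months of (made-up) actuals, projected three months ahead.
print(rolling_forecast([100, 110, 120], horizon=3))  # [110.0, 113.33, 114.44]
```

Each month, the oldest actual drops out of the window and the newest comes in, so the horizon always extends the same distance ahead rather than stopping at the fiscal year end.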

 

Visit us at www.centage.com, follow us on Twitter, or visit the Centage Blog for the latest insights on budgeting and forecasting strategies.

Email: jorlando@centage.com

Phone: (508) 948-0024

 

Artificial intelligence is shaping the future of retail. Smart algorithms and data analyses are creating sustainable performance benefits across all levels of the retail supply chain.

With its Omnichannel ePOS Suite, Wirecard AG is the first payment provider to offer a fully integrated solution for self-learning analyses based on payment data in combination with other data sources. The evaluations substantially support e-commerce and high-street retail in implementing the following central growth concepts: increasing customer conversion, reducing customer attrition rates, predicting future consumer behaviour and linking points of sale with e-commerce.

Jörn Leogrande, Executive Vice President Mobile Services at Wirecard: "Using our data evaluations and analyses, merchants can increase their metrics in important performance areas. Our previous experience has shown that sales increases in the double-digit percent range are realistic."

Wirecard's turnkey solution generates insights into customer segmentation and cohort analyses, for instance, to optimise marketing efficiency. The concept revolves around a data-supported, real-time view of a retailer's customer behaviour in its entirety, and around increasing customer lifetime value through optimal customer retention.

Insights into customer attrition (otherwise known as customer churn) behaviour are another unique selling point of the Omnichannel ePOS Suite. Complex evaluations enable merchants to identify customers who may potentially shop elsewhere. By introducing appropriate marketing measures, the churn rate can be significantly reduced.
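As a toy illustration of the idea, and in no way Wirecard's actual method (the customer IDs, dates and 90-day threshold below are invented), a single behavioural signal such as time since last purchase can stand in for the far richer evaluations described above:

```python
from datetime import date

# Hypothetical last-purchase dates per customer.
purchases = {
    "cust_001": date(2017, 7, 28),
    "cust_002": date(2017, 3, 2),
}

def at_risk(last_purchase, today, threshold_days=90):
    """Flag customers whose last purchase is older than the threshold."""
    return [c for c, d in last_purchase.items()
            if (today - d).days > threshold_days]

print(at_risk(purchases, today=date(2017, 8, 15)))  # ['cust_002']
```

Customers flagged this way would then be candidates for the targeted marketing measures described above, rather than being contacted only after they have already left.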

Analyses on anomalies, trends and sentiment, peak detection and time series based on country-specific data as well as cohort analyses to assess the efficacy of marketing measures are additional beneficial tools. The Omnichannel ePOS Suite can be used in pre-existing systems without incurring large expenses.

Markus Braun, CEO of Wirecard: "The Omnichannel ePOS Suite is the first step towards large-scale digital transformation in the retail sector. Over the next few years, data analyses using artificial intelligence and machine learning will play an increasingly important role in their business area. Based on our analyses, we are able to reduce risks and increase the chances of success for our partners. This means that all parties involved can gain a significant competitive advantage, which is why the omnichannel ePOS suite marks a decisive step for the future of payments."

(Source: Wirecard)

This week, IBM Security and Ponemon Institute released the annual Cost of a Data Breach report.

This year’s report found that the UK experienced a decrease in the cost of a data breach, from £2.53 million in 2016, to £2.48 million in 2017. The average cost per lost or stolen record in the UK is estimated at £98.

IBM has also created a "Cost of a Data Breach Calculator" for estimating the potential cost of a breach to your organisation.

(Source: IBM)

We're living in a data-rich world. IBM estimates that 90% of the data in the world today has been created in the last two years alone. This means it's crucial that businesses keep control of their sensitive customer data. Tanmaya Varma, Global Head of Industry Solutions at SugarCRM, illustrates to Finance Monthly the true potential of data use in the financial services sector.

For banks in particular, the safe and efficient storage of data is not just a ‘nice to have’ but a requirement governed by legislation and industry standards. I believe that whether on-premise or in the cloud, banks should strive to capture all their customers’ data together in one place. Why? Because it will empower employees with the right information to give customers the best experience possible.

Bringing together data streams

Perhaps more than any other industry, financial services firms have a huge number of channels through which to collect customer data: in-branch, over the phone and via social media platforms. This means they need the right data systems in place to bind together all of their data and build a complete picture of each customer.

The right system needs to bring together front-office data – calls, meetings, leads, opportunities – and back-office data – accounts, transactions, delivery schedules, fulfilment and so on. There is also a need, particularly for capital markets, to have external data integrated, for example LinkedIn data (where did this prospect use to work?) and trading figures.

In terms of where the data is stored, in my experience banks generally choose to keep their customer data in the cloud. No modern business, bank or otherwise, should keep its customer data in silos, as this immediately breaks the 360-degree view of the customer.

Meeting customer expectations

Today's customers expect the best experience possible. The instantaneous pace we now live at doesn't leave much room for patience, so consumers expect an immediate response to their demands. This means customer-facing employees need easy access to a customer's background as soon as the interaction begins if they are to stand a chance of delivering the best possible experience.

Customers need to know that, regardless of the channel, they'll receive the same level of service and the same understanding of their needs and expectations. This all amounts to the overall customer experience, which is crucial when customers are faced with so much choice. The threat of losing customers because of bad service is very real: according to Accenture's UK research, 34% of customers who switched financial providers in 2014 did so because of poor customer service.

All customer-facing teams (sales, marketing, customer service and so on) therefore need to have the right tools in place. Technology should empower employees in their interactions with customers; giving them all the information they need, when they need it. For example, providing clear information on the customers’ previous interactions (when did they last contact us? What other products do they hold with us?) – to enable a seamless experience which proves to the customer they are valued and understood.

Turning to technology

Looking ahead, AI will become increasingly important for banks when it comes to the customer journey. Many banks are already open to the possibilities of machine learning, and it has to be said, the capabilities of chatbots are becoming very impressive. Swedbank's web assistant Nina, for example, now has an average of 30,000 conversations per month and can handle more than 350 different customer questions.

But the customer experience depends on both the quality of the data, and how well employees can use it to then bring insight to their interactions. In my opinion, customer-facing employees and technology should work side by side to enrich the customer experience. The role of chatbots, virtual reality, NLP and so on should be to bring efficiencies to business operations, particularly when it comes to automating tasks and processes where humans don’t add value. In fact, a recent report by Accenture found 79% of banking professionals agree that AI will revolutionise the way they gain information from and interact with customers.

If banks rise to the challenge to store and manage all their data together, and their employees are supported with the right training and technology to quickly access customer data and understand – and even pre-empt – their needs, they’ll be on the path to success.

Technology is bringing the finance industry one step closer to defeating money laundering, thanks to the identification of irregularities in trends and patterns of data, creating more 'hits' and fewer 'false negatives'. Aashu Virmani, CMO at Fuzzy Logix, talks to Finance Monthly about the potential impact data analytics can have on fighting money laundering and changing your business for the better.

As long ago as November 2009, Forrester published a research report entitled 'In-Database Analytics: The Heart of the Predictive Enterprise'. The report argued that progressive organisations 'are adopting an emerging practice known as in-database analytics, which supports more pervasive embedding of predictive models in business processes and mission-critical applications'. And the reason for doing so? 'In-database analytics can help enterprises cut costs, speed development, and tighten governance on advanced analytics initiatives.' Fast forward to today and you'd imagine in-database analytics had cleaned up in the enterprise. Well, while the market is definitely 'hot', it appears that many organisations have yet to see the need to make the shift.

And that's despite the volumes of data increasing exponentially since Forrester wrote its report meaning that the potential rewards for implementing in-database analytics are now even higher.


Given that we can deliver analysis speeds between 10 and 100 times faster than if customers were to move the data to a separate application outside the database, we have a 'hard metric' that is very compelling in helping us convince prospects of the value of in-database analytics. It's what gives us confidence that the shift to in-database analytics as the standard for data analysis is a question of time rather than choice. Quite simply, the volumes of data now being created mean that the only way to process the data and find analytical value is to do so within the database. But, as ever, real-world examples are the best way to illustrate a point, so let's take an unusual one: money laundering.

Banks have a vested interest in staying compliant with the regulations in place for detecting and reporting money laundering. Anti-money laundering (AML) regulations have been in place for several years, and it is likely that most large banks have systems and processes in place to track and catch money-laundering activity. Despite this, we still hear about cases where the authorities have fined reputable banks for their failure to implement proper AML solutions. Not too long ago, in 2012, HSBC was fined $1.9 billion by the US Department of Justice for "blatant failure" to implement AML controls related to drug-trafficking money and, as recently as 2017, Deutsche Bank was fined $650m by British and US authorities for allowing wealthy clients to move $10 billion out of Russia. So why are current implementations and best practices not keeping up?

Let's look at the big factors that contribute to compliance failure in the realm of anti-money laundering:

With the money at stake for money launderers (according to the UN, $2 trillion is moved illegally each year), the efforts taken by criminals to avoid detection have become incredibly sophisticated.  Organised crime is continually seeking ways to ensure that the process of money laundering is lost within the huge amounts of financial data that are now being processed on a daily, hourly and even by-the-minute basis.  Their hope is that, because so much data is being processed, it is impossible to spot where illegal money laundering activity is happening.  And they'd be right, if you had to take the data out of the database for analysis.

Achieving a good degree of accuracy in a typical large bank means having to analyse billions of data points from multiple years of transactions in order to identify irregularities in trends and patterns. A traditional approach would require moving the data to a dedicated analytical engine, a process that could take hours or days or more depending on the volume of data. This makes it impossible to perform the analysis in a manner that can provide any real value to the organization. With in-database analytics, there is no need to move the data to a separate analytical engine, and the analysis can be performed on the entire dataset, ensuring the greatest possible coverage and accuracy.
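To make the contrast concrete, here is a minimal sketch of the in-database principle, with SQLite standing in for an enterprise warehouse and an invented 'more than three times the account average' rule; real AML analytics are vastly more sophisticated, but the point is that the aggregation runs where the data lives, with no extract step:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (account TEXT, amount REAL)")
conn.executemany("INSERT INTO txns VALUES (?, ?)", [
    ("A", 100), ("A", 120), ("A", 95), ("A", 5000),   # one outlier
    ("B", 2000), ("B", 2100), ("B", 1900),
])

# The outlier test is expressed in SQL, so it executes inside the
# database engine rather than after exporting the data elsewhere.
flagged = conn.execute("""
    SELECT t.account, t.amount
    FROM txns AS t
    JOIN (SELECT account, AVG(amount) AS avg_amt
          FROM txns GROUP BY account) AS a
      ON a.account = t.account
    WHERE t.amount > 3 * a.avg_amt
""").fetchall()
print(flagged)  # [('A', 5000.0)]
```

Note that account B's large-but-typical transactions are not flagged: the threshold is relative to each account's own history, which is exactly the kind of per-entity baseline that becomes impractical if the data first has to be moved out of the warehouse.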

One of our largest customers is a leading retail bank in India.  It was experiencing a rapid growth in data volumes that challenged its then-current AML processes.  By not needing to move the data for analysis, we were able to analyse billions of data points over a number of years (3+) of historical data to identify possible irregularities in trends/patterns, and do so in under 15 minutes – faster than any other method.  By not working to a pre-defined set of analytical rules and by letting the data 'speak for itself', it is possible to uncover patterns which occur naturally in the data. As a result, the bank is seeing an improvement of over 40% in terms of incremental identifications of suspicious activity and a 75% reduction in the incidence of 'false positives'.  In short, good guys 1, bad guys 0 because in-database analytics is having a very real impact on the bank's ability to spot where money laundering is happening.

I'm pretty sure that when Forrester published its report into in-database analytics towards the end of the last decade, it didn't envisage the fight against money laundering becoming a perfect case study for why in-database analytics is a no-brainer when handling large volumes of data. But in today's world, with ever-increasing data volumes and an ever more urgent requirement to draw trends and insight from that data, in-database analytics has come of age. It's time for every organization to jump on board and make the shift; after all, if it can help defeat organized crime, imagine what it could do for the enterprise.

You wouldn’t drink milk if it was five days past its sell-by date. You wouldn’t buy a computer in 2017 running Windows 98. Would you use data that you know is bad, incomplete or outdated? Rishi Dave, CMO at Dun & Bradstreet talks to Finance Monthly about the impact of using bad data, and what makes it bad.

Clearly, the answer here is a resounding no. Yet it seems this is common practice for many enterprises; in 2016, poor-quality data alone cost the United States $3.1 trillion. Most companies know how important data is: managers, financial decision-makers, data scientists and many others use it every day at work. Due to the constraints of time, some employees simply have no choice but to accept the data they're given and use it for financial contracts, supply chain management or prospecting new customers.

But this is risky business. A company can have all the data in the world at its fingertips, but realistically, how much of that data is accurate? And how is it being processed? Only by having the right tools and analytics can the consequences of bad data be avoided.

What’s the worst that could happen?

Bad data can mean many things; the data itself could be outdated, poorly formatted or inconsistent.

Sales and marketing teams rely heavily on up-to-date, real-time data to do their jobs effectively. It’s no use calling up the MD of a company, only to find out they no longer work there or now have a different title. This wastes considerable time and fundamentally limits a salesperson’s ability to sell; the average sales rep spends 64% of his or her time on non-selling activities. Wasted time leads to wasted revenue, which means bad data directly impacts the company’s bottom line.

A vital ingredient to growth

Bad data isn’t just a timewaster, but a growth-stopper. For companies to grow, they need the right data for the right business function. Marketers need to ensure their contact database is up to date, or face stunted growth opportunities. Nowadays, businesses are demanding more intelligent, data-driven, real-time insights to realise higher returns; 80% of marketers see data quality as critical to sales and marketing teams, and more than half are investing to address persistent data challenges.

Incorrect names or job roles, outdated phone numbers and inconsistent, badly recycled data will actively prevent a company from reaching desired prospects. The Databerg report in 2015 found that medium-sized companies were spending £435,000 on redundant, obsolete or trivial data. For SMEs, growth via data could be the difference between black and red, so making sure they have the right data is paramount. After all, if you water a plant with seawater, it won’t grow. Feed it with fresh water and watch it flourish.

Data is an opportunity

Data has the power to transform businesses – but feed bad data into a machine (or a company) and you’ll only get bad results. From lost customers to a damaged reputation and decreased revenue, everything is at stake. Of course, no company is immune to human error. But what a company can control is its flow of data and how it uses it.

Most businesses know that they have to act to improve the quality of their data. But the way they go about it is flawed: most batch-cleanse their data, but do so once a year at most. In the current age, where data flows constantly and new information about customers, partners, suppliers and the economy is available all the time, data insight is only ever as accurate as the data feeding it.

What’s the answer?

What businesses really need to do with their data is to integrate, clean, link, and supplement it so they have an accurate database on which to build their algorithms. This starts with foundational master data.

Master data is the foundational information on customers, vendors and prospects that must be shared across all internal systems, applications and processes in order for your commercial data, transactional reporting and business activity to be cleaned, linked, optimised and made accurate. It is essentially the foundation of your enterprise, and without it, not only does your AI infrastructure break down, but so does your business.

Whether it’s a hospital, a financial institution or a marketing agency, ensuring you have the right quality data must be top of every agenda. Data is an opportunity; don’t waste it.

Data and content management have a visible impact on a business, whether done correctly, in half measures, or not at all, and poor handling can have adverse effects on a firm’s reputation, operations and, in the long run, its vision. Here Katie Rigby-Brown, VP of Global Financial Services at SDL, discusses the currency of content in a global financial economy.

Operating in arguably one of the world’s first global industries, financial services organisations are no strangers to the challenges of managing an intercontinental customer base.

However, financial services organisations are now embracing the digital revolution in the race to survive against a backdrop of fintech competition, the hangover of reputational damage, and increasing regulatory compliance. Yet this also presents new challenges for how firms with a global employee and customer base handle their content.

One thing is certain: whether boutique or behemoth, the need to engage, capture and retain your customers at speed, without compromising data protection or anti-bribery and corruption law, irrespective of location or language, is now greater than ever before.

Traditionally, financial services organisations have managed their multilingual content in one of three ways: in-house teams; local niche providers or trusted freelancers; or a hybrid of the two. A myriad of systems currently exists to create, store and publish content, from managing marketing content and regulatory publications to managing customer information, policy documentation and learning programmes. At best, the sheer volume of options available causes businesses to overspend much-needed cash. At worst, irreparable reputational damage is caused and companies receive financial penalties for non-compliance.

Many will know that while this silo model worked historically, it is not a scalable solution for today’s environment. The rate of content production has increased sevenfold, and combined with increasing regulation, often unique to each country, this model simply isn’t sustainable any more.

As the industry becomes more digitalised, forward thinking, and interconnected, and at a rate incomprehensible to most, we must ask what this means for business and how to succeed in such an environment.

One way financial services organisations can succeed in this new digital environment is by taking control of their content ecosystem. A piece of content goes on a long, protracted journey, passing through many different hands and systems before it reaches its target audience. By having a strong grasp on where all content is and who it is with, from beginning to end, businesses reduce the risk of non-compliance with market or data regulations.

Having a tight grasp on the content ecosystem from end-to-end is critical for organizations who want to avoid fines or reputational damage. When content management is a top priority, the chance of physical or electronic content going missing or falling into the public domain is greatly reduced.

Another crucial way to succeed is to leverage your own existing assets. This is a catch-22 for many companies, which may have style guides for some areas but not for others; standardising these can be a challenge, especially when trying to maintain legacy content. However, customers today expect companies and brands to communicate with them in a consistent tone of voice that still manages to be personable.

The easiest (and cheapest) way to make the most of existing company assets and to take control of brand tone of voice is by maximising the investment that’s already been made. Most companies already have an established tone of voice. But instead of starting from scratch, creating style guides that show how to use that tone of voice in target markets will ensure the hard work already done doesn’t go to waste. This will also help to reduce the chances of being non-market compliant in those territories.

It may sound obvious, but using the right technology for the right use case is essential. A mixture of on-premise, SaaS or hybrid solutions that supports your content classification and your organisation’s appetite for cloud will allow you to respond to the challenges presented by agile fintech competitors, personalising, targeting and protecting your content.

Understanding the challenges that need to be overcome and finding the right technological solution to solve them could be the difference between a successful campaign and a reputational crisis. While financial services organisations are no strangers to dealing with global communities that require different content, often all at the same time, organisations that do not embrace the speed of change will fall behind and ultimately fail. Can your business risk playing fast and loose with data and brand reputation? The likely answer is no – so take action now, before it is too late.

Established in 1988, Target Professional Services is a UK-based company providing Data Cleansing and Verification solutions to the financial sector. Target verifies that common data is accurate, complete and up to date. Where records are found to be out of date, Target are able to accurately trace and verify the data to ensure records held are always compliant with GDPR and other regulations within the finance sector – in particular, the Pensions Regulator’s record-keeping guidance. Here Lisa talks to Finance Monthly about the company’s services, the upcoming GDPR and its impact on the business, and her role in growing Target into a leading data verification and trace company.

 

With the EU General Data Protection Regulation (GDPR) scheduled to come into effect in May 2018 – what would you say will be the impact that GDPR will have on businesses?

The new regulations will require greater data accuracy and accountability. The potential for fines, and the size of the fines that can be imposed, are significant, so GDPR should not be overlooked and needs both focus and a budget within any organisation.

 

What have Target Professional Services done to ensure that the company will demonstrate compliance with the directive in its entirety?

First of all, Target have reviewed and updated all of our internal processes where GDPR will require change. In addition, we are checking our suppliers to ensure that they will be compliant with the new regulations, so we are clear that we are using consented data. We know that some datasets will require individuals’ consent to continue to be used, so we are looking to ensure that consent is obtained or that such data is not used.

In what ways can the company’s services assist others with becoming fully-compliant?

We are sharing our experience and understanding with our existing clients so they are clear about GDPR. We are constantly finding different levels of understanding throughout our client base and we work with them to improve their knowledge.

 

Could you tell us a bit about your career path?

Leaving school at 16 with 10 GCSEs and unable to afford to go to university, I started work with Halifax Building Society and by 18, I had been promoted to Department Manager. However, I took the decision to leave the Halifax, as my aspirations were not in banking. At that time my father had invented a high-pressure valve cap for vehicles. He needed BS5750 certification, so I studied the requirements and wrote his manuals for him. I also worked as a part-time bookkeeper for my mother, who ran a small independent debt collection agency, while I studied Accountancy, Law, Economics and credit control at night school. After successfully building a computerised accounts system for my mother, I identified a need in the market to transfer manual accounts to computerised systems and went on to support other businesses in successfully migrating their accounts data. With the merger of several rental companies in 1997, the debt collection business expanded, as did my role. Along with designing and implementing the CRM database to support the expansion, I took over the management of Customer Service and Field Operations, before finally buying the business in 2001.

 

You’ve managed to build Target from a small debt collection business to a leading data verification and trace company – what were the challenges that you were faced with and how did you overcome them?

The debt market was very competitive and I had one very large client when I took over the business.  I knew that I had to change the dynamics and the markets the company operated in. We entered the Pensions Market bringing innovation and competitive pricing at a time of regulation change. Target has focused on Customer Service, Data Quality and flexibility to ensure that our business does not become stagnant and stale. We bring innovation to solve the problems legislation brings to the industry and to ensure that our clients are always ahead of any changes.

 

What would you say are the company’s top three priorities towards its clients? How has this evolved over the years? 

Our philosophy in working with our clients remains the same today as it’s always been. We look to develop long-standing working relationships with all of our clients and to understand what they require from us. Every client is different, so we also look to be flexible in order to suit each client’s needs. Target has always been an industry innovator, and this is still a driver for us today, as tracing and data availability change and develop.

 

Looking into the rest of 2017 and beyond, what does the future hold for you and Target?

We see opportunity to apply what we do to many different industries, especially with GDPR soon upon us. We predominantly work in the financial services sector and then mostly, in the pensions sector, but tracing and data screening is of value elsewhere. We are exploring such opportunities and offering solutions in new markets. Contact us if you think we can help you. Through a partnership approach we may be able to offer you a service that gives value to what you do.

 

With GDPR just around the corner (May 2018), the new EU rules are probably something you want to start thinking about, as companies that ignore them risk serious exposure on data protection. But do the rules require you to hire a data protection officer? Richard Henderson, global security strategist at Absolute, provides Finance Monthly with the expert tips you’ve been looking for.

In just over a year the EU’s General Data Protection Regulation (GDPR) comes into effect, part of which stipulates that some organisations will need a data protection officer (DPO). Impacted companies that haven’t already assessed their data protection technology, policies and processes against the regulation’s mandates need to take action now to address any shortcomings.

The regulation may have been four years in the making, and amended throughout the process, but what has been clear from the start is that it intends to define an era where lax data management is not tolerated. The letter and spirit of the regulation reflects an expectation that data protection should be a priority, not an afterthought. Individuals’ rights around their data will be strongly upheld and companies found wanting will face tough punishment.

In this, the financial services sector has some experience. Despite being responsible for a relatively small percentage of the total security breaches reported to the Information Commissioner’s Office (ICO) in 2015-16, it attracted a third of the financial penalties the ICO pursued. With fines for data protection non-compliance set to rise significantly under GDPR (up to four per cent of annual global turnover), the industry cannot afford to ignore this, and must prepare.
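To put the four per cent figure in concrete terms, here is a minimal sketch of the exposure calculation. The article cites the 4% ceiling; GDPR's upper tier is the greater of EUR 20 million or 4% of annual global turnover, and the turnover figures below are invented for illustration.

```python
def max_gdpr_fine(global_turnover_eur: float) -> float:
    """Upper-tier GDPR exposure: the greater of EUR 20m or 4% of
    annual global turnover. Illustrative sketch, not legal advice."""
    return max(20_000_000, 0.04 * global_turnover_eur)

# A firm turning over EUR 2bn faces exposure of up to EUR 80m...
print(max_gdpr_fine(2_000_000_000))  # 80000000.0
# ...while a smaller firm is still exposed to the EUR 20m floor.
print(max_gdpr_fine(100_000_000))    # 20000000
```

Either way, the number dwarfs the ICO's pre-GDPR maximum penalty, which is the point the regulation is making.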

The overall aim of GDPR is to make EU privacy laws fit for the 21st century. While there is a major emphasis on enforcement it also introduces mandatory data breach reporting requirements, in some cases within a challenging timeframe of 72 hours.

 

The role of the data protection officer

The requirement to appoint a data protection officer (DPO) is summarised as applying to “public authorities,” “organizations that engage in large scale systematic monitoring” and “organizations that engage in large scale processing of sensitive personal data”.

Organisations meeting these requirements will need to make someone responsible for data protection. It will be extremely important to have the right person for the job so legal advice should be considered when hiring.

The DPO must have expertise in data protection law and practices, and is expected to keep their knowledge up to date and to report directly to the highest level of management. In short, this is not a responsibility to be taken lightly or tagged onto an existing role where the necessary level of expertise, knowledge and responsibility does not already exist. It is a professional role, expected to be accorded a sufficient level of seniority, with standing in the firm and the resources to maintain and build on knowledge.

DPOs will need to be supported by a thorough assessment and (where necessary) overhaul of policies, processes and procedures to ensure GDPR-readiness. A big part of their job will be ensuring the right technology is in place to prevent data breaches, while maintaining and reporting on security.

 

Enough is not good enough

The cyber-attack threat landscape continually changes, forcing businesses to evolve their security strategies and policies to keep up. The risk of non-compliance with GDPR is simply too high, not just in terms of potential financial impact but also corporate reputational damage from compromised data. A DPO will be central to safeguarding the organisation’s reputation, maintaining the right technology and ultimately, preventing a large-scale data breach.

GDPR recognises that circumstances have changed immeasurably since its predecessor, the 1995 Data Protection Directive, when the internet was still in its relative infancy. Today, larger volumes of data are not only created and stored but also widely transferred and held on mobile devices.

GDPR had to bring data protection enforcement up to date for the modern day. By setting fines for infringements at the level it has, it is sending out a clear message that ‘enough’ is not good enough. Companies need to make data protection part of the fabric of their organisation or pay the price for not doing so.

The price could be hefty indeed for UK business. If cybersecurity breaches stay at the level reported in 2015, fines could rise from £1.4 billion to £122 billion, according to the Payment Card Industry Security Standards Council.

Companies with limited IT knowledge and expertise may feel that punishments meted out after the event should be balanced by guidance and instruction on breach prevention, so that they can avoid falling foul of the regulation. While it is rightly incumbent on companies to adequately secure data, the options available to them for doing so are matched in number and variety only by the methods hackers have for getting in.

EU GDPR is incontrovertibly punitive, but companies looking at it in full should also see the opportunity the regulation gives them to avoid incurring penalties.

 

Taking stock

By interpreting what the measures require of them, companies can take action to keep data safe and thereby avoid non-compliance. This includes putting in place processes to provide data to subjects if they ask for it, and to remove records on request when it is no longer necessary to hold them. It includes potentially appointing the data protection officer and, perhaps above all, it mandates ‘privacy by design’, meaning that data protection has to be built into systems when they are designed rather than added on afterwards.

This last measure is – if any were needed – the clearest indication of the regulator’s intention to instil into all companies a culture of data protection, one that drives systems and processes rather than the other way round.

A designated DPO provides the level of time and expertise that robust data protection now requires. After all, 72 hours to report a breach is a short space of time, and staying on top of policies and processes around data retrieval, access and removal is a big job. Organisations need the capabilities in place to manage data across their entire device estate. A single point of contact with specified responsibilities stands to help the company at the same time as helping the regulator.
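The 72-hour window is easy to state but unforgiving in practice. A trivial sketch of the reporting clock (the discovery timestamp below is made up for illustration):

```python
from datetime import datetime, timedelta

# Hypothetical discovery time. GDPR requires notifying the supervisory
# authority without undue delay and, where feasible, within 72 hours
# of becoming aware of the breach.
discovered = datetime(2018, 5, 28, 9, 30)
deadline = discovered + timedelta(hours=72)

print(deadline)  # 2018-05-31 09:30:00
```

Three calendar days, including any weekend that falls inside them – which is exactly why a named owner of the process matters.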

Above all else, a dedicated data protection role will help companies prevent data issues, safeguard their reputation and avoid potential non-compliance.

For one particular part of the financial services sector, GDPR presents a specific opportunity. Strict new rules should mean the cyber insurance market will grow. With breaches set to be more widely reported under the new regulations, more data will be available to insurers to set premiums so we are likely to see an increase in the number and range of cyber insurance offerings.

Companies concerned by the length and breadth of the EU GDPR should step back and consider that, in simple terms, it obliges organisations to put in place security measures appropriate to the risks. If a data breach occurs, it will be hard for that organisation to argue that it had done so. Therefore, the goal will then be what it is now: to have in place the resources, policies, processes and technology to prevent breaches.

Companies should reassess how they detect suspicious activity on their network and consider options for persistent connectivity and encryption for systems, devices and data. The threat of higher fines certainly focuses attention on data protection but in reality, it must always be a top priority for the financial services sector.

No one wants to have their good company name smeared in the headlines because of a breach or incident that could have been avoided. It’s up to all of us in the security space to ensure that we are doing everything we can to keep the data entrusted to our protection safe from harm. We owe it to ourselves, our shareholders, and the public who trust us to steward their most sensitive of data.

About Finance Monthly

Finance Monthly is a comprehensive website tailored for individuals seeking insights into the world of consumer finance and money management. It offers news, commentary, and in-depth analysis on topics crucial to personal financial management and decision-making. Whether you're interested in budgeting, investing, or understanding market trends, Finance Monthly provides valuable information to help you navigate the financial aspects of everyday life.
© 2024 Finance Monthly - All Rights Reserved.