C-Centric nominated for the “Most Innovative use of AI” award by Data IQ judges. https://www.ccentric.co.uk/c-centric-nominated-for-the-most-innovative-use-of-ai-award-by-data-iq-judges/ Thu, 11 Sep 2025 13:32:01 +0000 https://staging.ccentric.co.uk/?p=3612

The Data IQ judges have nominated C-Centric for the “Most Innovative use of AI” award. The nomination recognises the development of agentic bots that advise consumers and recommend options on utility switching. The agentic bots ingest consumer browsing behaviour, socio-demographic attributes and home-value data to power conversational recommendations. They then transfer the consumer to the supplier’s e-commerce checkout with the selected product details pre-populated. They can also initiate and hand over the conversation to the supplier’s chat agents to complete the transaction, so no conversational context is lost. In A/B tests this has increased conversion by up to 89%.

Why You Should Re-Platform Your Customer Marketing Database To Snowflake https://www.ccentric.co.uk/why-you-should-re-platform-your-customer-marketing-database-to-snowflake/ Thu, 11 Sep 2025 13:03:01 +0000 https://staging.ccentric.co.uk/?p=3608

We are often asked by CDOs and CMOs to document the business case for moving their marketing data platform to Snowflake. In this article I explore the key reasons why you should actively consider re-platforming your customer marketing database onto Snowflake.

 

Enhanced Performance and Speed

Marketing databases often struggle with processing and analysing large datasets, leading to bottlenecks and slower query response times. Snowflake processes queries using massively parallel processing (MPP) compute clusters. By providing elastic access to scaled compute at low cost it addresses some common speed and performance challenges in customer management.

 

There are five areas we find most transformative for clients:

 

  • Campaign execution & performance
  • Cost
  • Real-time personalisation
  • Model refresh
  • Self-serve customer and campaign MI

 

 

Improved Campaign management

 

By re-platforming your customer marketing database to Snowflake you get a quantum leap in performance and speed when running campaigns.

This is pretty critical, as today’s customer marketing campaigns run off significantly larger data volumes, more complex data structures and a wider range of data types than even two years ago. Campaign selections over large data sets that would have timed out or taken hours now run in seconds on Snowflake.

 

Snowflake’s support for diverse data types and semi-structured data empowers your marketing team with the capability and speed to harness the full spectrum of customer information. This includes unstructured data from devices, messaging, social media, clickstream data and more. We have helped clients build high-performing campaigns that query customer service chat, email and voice-call transcripts directly from nested JSON data in Snowflake. Its ability to function as a data lake query engine provides great scope to incorporate additional data for campaigns at speed.
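As an illustration of the kind of query involved, Snowflake’s LATERAL FLATTEN can unnest transcript messages stored as VARIANT JSON directly. This is a sketch only: the table and column names below are hypothetical.

```sql
-- Hypothetical table: chat transcripts landed as a VARIANT column (nested JSON)
SELECT
    t.customer_id,
    m.value:role::STRING AS speaker,
    m.value:text::STRING AS utterance
FROM chat_transcripts t,
     LATERAL FLATTEN(input => t.payload:messages) m
WHERE m.value:text::STRING ILIKE '%cancel%';   -- e.g. surface churn signals
```

The same pattern works over staged files via external tables, which is what makes the "data lake query engine" usage practical for campaign selections.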

 

Cost

What often surprises clients is the cost benefit of such increased performance. Usually there is a straight-line cost saving at the time of migration, especially if you have been on a managed service with an outsourcer.

Key FD-friendly features:

  • Pay for what you use and avoid over-provisioning – Snowflake’s pricing model means you only pay for the resources you use, which can save money in the long run. Usage can automatically scale up or down based on demand, so there’s no risk of over-provisioning.
  • Snowflake offers cost and workload optimisation features that help you enforce cost controls and discover resources that need fine-tuning.
  • Eliminate legacy software licence fees.

 

Many of the major Martech tool vendors run directly off the Snowflake data engine. 

For example, we often set up Adobe Campaign to run directly off the Snowflake data engine. In addition to a significant speed improvement, we also eliminate the wasted effort, time and latency of wrangling data into campaign marts.

Most of the leading technologies in the modern martech and adtech space will talk directly to the Snowflake data engine. These Martech vendors are driven by the need to get closer to a unified customer data store equipped with native processing capabilities. This also means you are not locked into any one tool vendor: you can plug and play best-of-breed tools off your centralised data spine. This is why many CMOs favour a composable CDP route over a siloed packaged CDP. By working with tools that are closer to the data, you optimise the expensive time of your data engineers and cut the time and expense of getting your campaigns right. Marketers shouldn’t be hampered by data friction; they should be working on what they love to do: delivering differentiated campaigns with optimal speed, accuracy, and agility.

 

Matching

Snowflake supports non-SQL code within Snowpark for complex transformations, simplifying data integration. We use our data matching and identity resolution platform AudiencePlus within Snowflake to link records and consolidate diverse datasets into a single source of truth. This breaks down data silos, enabling a holistic view of customer interactions and preferences. It uses AI to recognise partial and misspelt data in names, addresses, emails and contact numbers. It utilises a UK data universe to append data fields (forename, DoB, email, mobile) to assist matching algorithms and record linking. This provides a tunable matching environment for both deterministic and probabilistic matching.

  • Link accounts for one person
  • Create household views
  • Link a customer’s digital activity across web and app
  • Bridge anonymous to known IDs
  • Match and overlay external commercial data by name, address and other match keys
  • Increase onboarding match rates with Google, Facebook and other ad platforms
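To illustrate the general idea behind deterministic versus probabilistic matching (this is a toy sketch, not AudiencePlus itself; the field names, weights and threshold are illustrative):

```python
from difflib import SequenceMatcher

def normalise(value: str) -> str:
    """Lowercase and strip punctuation/whitespace so trivially
    different spellings compare equal."""
    return "".join(ch for ch in value.lower() if ch.isalnum())

def deterministic_match(a: dict, b: dict, keys=("email", "postcode")) -> bool:
    """Deterministic: exact match on stable identifiers after normalisation."""
    return all(normalise(a[k]) == normalise(b[k]) for k in keys)

def probabilistic_score(a: dict, b: dict, weights=None) -> float:
    """Probabilistic: weighted fuzzy similarity across fields, in [0, 1]."""
    weights = weights or {"name": 0.5, "address": 0.5}
    return sum(
        w * SequenceMatcher(None, normalise(a[f]), normalise(b[f])).ratio()
        for f, w in weights.items()
    )

rec1 = {"name": "Jon Smith", "address": "1 High St",
        "email": "jon@x.com", "postcode": "N1 1AA"}
rec2 = {"name": "John Smith", "address": "1 High Street",
        "email": "JON@X.COM", "postcode": "n1 1aa"}

# Exact on email + postcode; fuzzy on name + address clears a tunable threshold.
assert deterministic_match(rec1, rec2)
assert probabilistic_score(rec1, rec2) > 0.8
```

A production identity resolution engine adds blocking, reference-data enrichment and trained similarity models, but the deterministic/probabilistic split above is the tunable trade-off the article refers to.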

 

Personalisation data layer

With a single customer view, you can drive great value through both digital channels and inbound calls. Snowflake supports Hybrid Tables, enabling high-speed row-level data look-ups as well as storage optimised for analytical queries. This means you don’t need multiple data technologies: you can drive on-site personalisation, customer identification and next-best-action (NBA) within Snowflake, without the overhead of synching multiple data marts and all the associated latency.
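A minimal sketch of what this looks like (object names hypothetical): a hybrid table declares a primary key, and single-customer look-ups are served from its row store while analytical queries run against the same logical table.

```sql
-- Hypothetical hybrid table backing low-latency personalisation look-ups
CREATE HYBRID TABLE customer_profile (
    customer_id       NUMBER PRIMARY KEY,
    email             VARCHAR,
    segment           VARCHAR,
    next_best_action  VARCHAR
);

-- Point read at page-render time, served from the row store
SELECT next_best_action
FROM customer_profile
WHERE customer_id = 1042;

-- Analytical scan over the same table, served from columnar storage
SELECT segment, COUNT(*) FROM customer_profile GROUP BY segment;
```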

 

Machine learning and analytics

Snowflake’s support for machine learning and data science workflows further amplifies the value of your customer data, enabling predictive modelling, customer segmentation and propensity scoring. The ability to leverage advanced analytics within Snowflake’s environment eliminates the need for data movement, streamlining your analytics workflows and accelerating time-to-insight.

The game changer is to work across large volumes of data with rich intent signals and refresh models at great velocity to drive critical time sensitive customer campaigns and personalisation.

We have used these capabilities with telecommunications clients to build customer churn radars. Our models were able to detect customer disengagement from hundreds of millions of rows of PAYG transaction usage data, and we were able to trigger push-messaging campaigns offering data vouchers to those at risk of lapsing. These models, driven by streaming big data feeds, can power game-changing campaigns in churn prevention and cross-sell.

 

Gen AI

A key new feature of Snowflake is its role as a vector store with Gen AI capabilities.

We classify and label inbound customer voice & chat transcripts using Cortex functionality. This allows us to detect multiple signals within conversations – call drivers, dissatisfaction, underlying cause of complaint, vulnerability and cross sell opportunities. Our models use this data with additional customer variables for highly effective triggered churn and cross sell customer campaigns.

  • Speed – using Snowflake and the Arctic LLM, we can achieve faster throughput at far lower cost than many other foundation models.
  • Accuracy – classifier models trained for the specific industry.
  • Data security – no third-party data transfer; data stays in the Snowflake environment, with no need for additional tech clutter or the complications of moving data.
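A hedged sketch of the labelling step using Snowflake’s built-in Cortex classification function; the table, column and label names here are illustrative, and the exact label taxonomy would be tuned per client.

```sql
-- Hypothetical transcript table; labels chosen to match the signals above
SELECT
    transcript_id,
    SNOWFLAKE.CORTEX.CLASSIFY_TEXT(
        transcript_text,
        ['complaint', 'churn risk', 'vulnerability', 'cross-sell opportunity']
    ) AS signal
FROM call_transcripts;
```

The classified output can then be joined with other customer variables to feed the triggered churn and cross-sell campaigns described above.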

 

Streamlined Data Management and Collaboration

The migration to Snowflake redefines the landscape of collaboration within your marketing organisation. Snowflake’s cloud-based data platform offers a unified data spine and environment for storing, processing, and sharing all customer data. Secure data sharing facilitates seamless collaboration and knowledge sharing across teams like advertising, operations and data science. We have helped clients securely share select datasets with external affinity marketing partners, agencies, or media vendors, fostering collaborative marketing initiatives and improved media ROI. We have helped companies run some great affinity partnerships with directly measurable sales by match-back through secure data share. This frictionless data sharing empowers your business to leverage external expertise and enrich your customer insights, driving innovation and differentiation in your marketing strategies.

 

MI and Campaign analysis

Cloud analytics has emerged as a revolutionary approach to self-serve MI through tools like Tableau and Power BI. We are commonly asked to pair Snowflake’s cloud data warehouse with BI tools. Customer operations dashboards and campaign analysis are among the most challenging areas for BI tools; these projects have traditionally been plagued by slow speeds, incomplete data and results that conflict with source-system reports. Snowflake gives users speed on large volumes of data through elastic compute and support for both structured and unstructured data. The net result is more timely and consistent metrics across a wider set of data. One feature we love is Snowflake’s Time Travel, which allows you to analyse data at different points in time – invaluable for historical trend analysis. This can provide insight into how your data has evolved and help you make better data-driven decisions.
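For example, Time Travel lets you run the same query against yesterday’s state of a table with one clause; the table name below is hypothetical.

```sql
-- Today’s metric vs. the same metric 24 hours ago, from one table
SELECT COUNT(*) AS responses_now
FROM campaign_responses;

SELECT COUNT(*) AS responses_24h_ago
FROM campaign_responses AT(OFFSET => -60*60*24);  -- offset in seconds
```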

 

Conclusion

By investing in Snowflake, you are planting your flag in a technology the whole organisation can invest in for the future, minimising the number of disparate data repositories. The enhanced performance and speed offered by Snowflake not only streamline your marketing operations but also enhance the overall customer experience. By leveraging real-time insights, you can personalise marketing communications, deliver targeted promotions, and respond promptly to customer interactions, fostering stronger relationships and driving higher engagement with your brand.

Snowflake Unistore and Hybrid Tables – What Are They and How Can They Benefit Your Organisation? https://www.ccentric.co.uk/snowflake-unistore-and-hybrid-tables-what-are-they-and-how-can-they-benefit-your-organisation/ Thu, 11 Sep 2025 12:56:54 +0000 https://staging.ccentric.co.uk/?p=3604

In this article I want to explain how Snowflake’s new hybrid table support and Unistore architecture open up exciting new options for both Martech and Adtech use cases.

 

Hybrid tables

Since its release in 2015, many Snowflake-based data warehousing solutions have involved all OLTP being performed on external transactional databases such as Postgres, MySQL, or SQL Server, with periodic ETLs into (or out of) a centralised Snowflake data warehouse. Only once the data is in Snowflake (or a similar DWH platform) can OLAP be performed on the transactional data. Similarly, performing OLTP on Snowflake OLAP data products involved ETLs from Snowflake into a transactional database. While this solution works well for the most part, it is not without its caveats. Namely:

  • Data latency between the two systems.
  • Maintenance of ETL pipelines.
  • Cost of executing ETL pipelines, including request charges on the source database or Snowflake, data processing costs of a compute instance such as an EC2 instance, and data egress charges.

To address this, Snowflake has been rapidly developing a range of new features to enable workloads beyond the OLAP it was initially intended for in 2015, bypassing the need for disparate data sources and complex ETLs. Amongst these new features are Hybrid Tables. While not a novel concept in the world of research (first described by Dr. Hasso Plattner in 2009), in industry Hybrid Tables are a recent development. On Snowflake, they were first made available in private preview in 2022 and later released into GA in Q4 last year – making now a great time to dive into the implications of this technology for the Snowflake ecosystem and the wider community.

Hybrid tables enable OLTP to be performed within Snowflake and integrated seamlessly with OLAP in a “Unistore” workload, facilitating a range of OLTP use cases, from serving transactional data at high concurrency to a web application through to business-critical financial transaction systems. The architecture satisfies all the requirements of such systems, including entity/referential integrity and high-concurrency point read/write throughput. At the same time, analytic workloads can be carried out just as on any standard Snowflake table, completely asynchronously and without interruption to the ongoing high-concurrency transactional processes that keep the application running at low latency.

 

Columnar vs. Row Storage Architectures (OLAP vs. OLTP)

In standard Snowflake tables, data is organised into compressed immutable columnar files in object storage (S3, Azure Blob, GCS), with one file per column per micro-partition (a logical and physical clustering of records) and each micro-partition containing tens of thousands to millions of records. Snowflake maintains micro-partition metadata including the range of values stored in each of the columnar files. When a query or DML operation with a filter predicate (e.g. ‘where’ clause or ‘join’ condition) is executed against the table, Snowflake does a lookup against the micro-partition metadata such that only files whose value ranges overlap with the filter predicate are downloaded to the warehouse to be scanned. This type of ‘pruning’ works exceptionally well for OLAP workloads which typically involve operations over large spans of a column – such as joins or aggregation. This is for a few reasons:

  • The process of clustering and pruning described above greatly reduces the amount of data that needs to be scanned, particularly when using predicates over natural dimensions of the data such as dates.
  • Columnar compression greatly reduces the volume of data being transferred over the network when table data is being copied over from object storage to the compute warehouse.
  • Columnar storage means only the columns specified in the select clause are transferred.
  • Warehouses cache table data and interim results such as outputs from joins to expedite sequences of similar queries.

Since the columnar files are immutable, when DML operations are executed against the table, all micro-partitions containing affected records are locked for the duration of the operation – even if only a single record is being updated. This means that if process A is updating a single record when process B executes a DML operation on a single record out of the thousands or millions of records that happen to reside in the same micro-partition, it cannot do so until process A has:

  • Copied the files containing possibly millions of values from object storage to the warehouse
  • Decompressed the files
  • Scanned the files
  • Processed the files to update the record
  • Compressed the newly created files containing the updated field (remember, the columnar files are immutable).
  • Copied the new file over to object storage (S3)

While this process is efficient on OLAP ‘bulk’-type workloads, as they do not tend to have many concurrent low-volume writes, this architecture would result in poor performance for OLTP-type workloads which typically have high volumes of concurrent random point (single/few records) writes. Similarly for point-read operations, working with columnar files containing thousands to millions of values is also extremely inefficient.

On the other hand, traditional OLTP-optimised databases such as MySQL and Postgres use a row store in which records are stored completely independently of one another. As a result, they have the following properties:

  • Row-level locking, as opposed to micro-partition level locking, gives this architecture far greater efficiency when dealing with many concurrent random point read/write operations – especially on smaller tables.
  • Only the relevant records are retrieved during read/write operations, as opposed to the whole micro-partition. When a single operation only deals with a small number of records (as is typical in OLTP), the result is that the volume of data transfer is far smaller due to the reduced redundancy. It should be noted that when dealing with many records per operation (as is typical in OLAP), the columnar compression and column separation outweigh the redundancy.

 

Enforcement of entity and referential integrity is another essential requirement for guaranteeing data correctness in OLTP. While Snowflake allows users to ‘define’ primary keys on standard tables, this is only descriptive: there is no built-in functionality to enforce primary key uniqueness or referential integrity on standard Snowflake tables. On the other hand, traditional row-store-based databases such as MySQL or Postgres have entity and referential constraints built in. This means that any operation that tries to break these constraints, such as deleting a record whose primary key is a foreign key in another table (resulting in an ‘orphaned’ record), would be blocked and result in an error. I spent some time working in a data acceptance testing team for a migration over to Snowflake and found this to be a recurring issue in the presentation layer for an Adobe Campaign Monitor backend.

Snowflake Unistore

Hybrid Tables combine both columnar and row storage architectures into a single logical database object. With the row store as the primary storage, Hybrid Tables satisfy the referential and entity integrity requirements of OLTP and are well optimised for typical OLTP high-concurrency point read/write workloads. Asynchronously, a secondary columnar object storage is maintained. This is identical to that of Snowflake’s standard tables, meaning more Snowflake-typical OLAP workloads involving large scans can be done directly on the table without interruption or performance impact on the ongoing OLTP.

While this may sound complex, Snowflake users only get a single view of the logical Hybrid Table, even though it comprises two underlying data structures. When a query is executed against the Hybrid Table, the query optimiser automatically chooses on which data structure the operation will take place.

Optimised Bulk Loading

Snowflake users have the option of using optimised bulk loading to load data into Hybrid Tables. This method is significantly faster and more cost-effective than loading data into Hybrid Tables incrementally (depending on your solution) when loading large volumes of data into the table. At the time of writing, optimised bulk loading only kicks in on the initial load into the table – i.e. if the table is currently empty but records were previously deleted, optimised bulk loading will not be used.

Until recently, a limitation of Snowflake Hybrid Tables was that bulk loading could not be used in conjunction with foreign keys, as it was only supported via CTAS statements. In January 2025, Snowflake announced support for optimised bulk loading with INSERT INTO and COPY INTO statements (provided the table has always been empty), whereby the user can define the foreign keys in the CREATE TABLE statement and then bulk load into it. Further, Snowflake has announced that it intends to add optimised bulk loading for incremental batch loads in the future.
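The shape of this pattern looks roughly as follows (table and column names are hypothetical): define the keys up front, then perform the first load with a plain INSERT INTO so the optimised bulk path can kick in.

```sql
-- Define the hybrid table with its primary and foreign keys first
CREATE HYBRID TABLE order_line (
    order_line_id NUMBER PRIMARY KEY,
    order_id      NUMBER,
    sku           VARCHAR,
    FOREIGN KEY (order_id) REFERENCES orders (order_id)
);

-- First load into the never-written-to table uses optimised bulk loading
INSERT INTO order_line
SELECT order_line_id, order_id, sku
FROM staged_order_lines;
```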

Consistency and Latency Between Row and Columnar Stores

Users can choose between a session-based or global consistency model via the “READ_LATEST_WRITES = true/false” option. If set to false (the default), users can expect data staleness of up to 100ms between sessions (from row store to columnar store) and zero staleness within the session. If set to true, there is no data staleness; however, the latency of operations on the row store may increase by a few milliseconds (according to the Snowflake documentation). Ultimately this depends on the use case, but in most OLAP scenarios 100ms is negligible.

Cost

Snowflake uses the same compute pricing model regardless of table type. Accounts are charged on the same per-second billing for the compute warehouse used for processing on the Hybrid Table. Generally, OLTP should be done on a Hybrid Table using a multi-cluster XS warehouse and scaled ‘out’ rather than ‘up’ with workload – meaning increasing the MAX_CLUSTER_COUNT parameter rather than the warehouse size when defining or altering the warehouse.
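Scaling ‘out’ rather than ‘up’ looks like this in warehouse DDL (the warehouse name and cluster counts are illustrative):

```sql
-- Small clusters, many of them: concurrency scaling for OLTP point operations
CREATE WAREHOUSE oltp_wh
    WAREHOUSE_SIZE    = 'XSMALL'
    MIN_CLUSTER_COUNT = 1
    MAX_CLUSTER_COUNT = 8     -- scale 'out' with concurrent load
    AUTO_SUSPEND      = 60
    AUTO_RESUME       = TRUE;
```

Note that multi-cluster warehouses require Snowflake’s Enterprise Edition or above.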

Snowflake charges users $40 per compressed TB (at the time this was written, depending on the Snowflake Edition). Users should expect to incur an additional storage cost due to the dual data structure architecture. The secondary columnar storage cost is the same as it would be if it were a standard Snowflake table. In addition to this, users must pay the cost of the row storage – which tends to be higher due to the lack of columnar compression. Therefore, users can expect to pay more than double the storage cost of a standard Snowflake table of equal size, given that they are paying for the combined storage cost of the two architectures.

The solutions that a hybrid-table-based methodology would be replacing would typically involve:

  • An RDS (or Azure/GCP equivalent) instance hosting the application database – these can be quite expensive.
  • ETL costs for:
    • Execution of extraction queries on the source database.
    • Costs of running a compute instance (or lambda function) to execute extraction queries and possibly perform some data transformation.
    • Data egress costs of moving data from the RDS instance to the compute instance and onto the Snowflake stage.
    • Snowflake compute costs for ingesting the data.

Use Case

With recent improvements in Snowflake’s Hybrid Table technology and release into general availability, the number of organisations incorporating Hybrid Tables into their solutions has grown rapidly.

We have implemented Hybrid Tables in a real-time personalisation system for a ticketing platform. We needed to provide recommendations and promotions within the user visit, both pre and post ticket purchase. These promotions were part of a Retail Media implementation, managed by a graph that allowed third-party organisers and promoters to select audiences and define on-site promotions for their events. By serving promotion treatments from Hybrid Tables we reduced lookup latency. It also allowed us to maintain unified governance of data: all sensitive data was kept within Snowflake, and we avoided data wrangling and synching with third-party data stores. Event transactions from these promotions (impressions/clicks), captured in the hybrid tables, were available for instant analysis via the Unistore OLAP tables. The performance of Snowflake was key, as we had hundreds of concurrent reporting users across the third-party advertiser base.

Snowflake is an excellent environment for generating recommendations for customers, especially with its latest efforts in Snowpark ML and Snowflake Cortex. Precomputed treatments can be bulk-generated from data feeds as well as customer browsing behaviour. A Task (essentially Snowflake’s cron job service) can schedule optimised bulk loads into the Hybrid Table using a CTAS statement. Single-customer recommendations, which are point-read operations, can then be queried by the web application’s business logic, serving fresh OLAP-generated recommendation data at the “double-digit millisecond” latency required by such applications. What really makes this a great use case is that previously you would have had to load precomputed treatments back into some relational database system on a much less frequent basis. With this solution, promotional treatments can be generated and refreshed far more frequently and at much lower latency.
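A sketch of the Task-plus-CTAS refresh described above; all object names are hypothetical, and the exact CTAS column syntax for hybrid tables should be checked against current Snowflake documentation.

```sql
-- Scheduled refresh of precomputed recommendations into a hybrid table
CREATE OR REPLACE TASK refresh_recommendations
    WAREHOUSE = reco_wh
    SCHEDULE  = '15 MINUTE'
AS
    CREATE OR REPLACE HYBRID TABLE recommendations (
        customer_id     NUMBER PRIMARY KEY,
        recommended_sku VARCHAR,
        score           FLOAT
    ) AS
    SELECT customer_id, recommended_sku, score
    FROM precomputed_treatments;
```

The web application then issues point reads against `recommendations` by `customer_id`, served from the row store.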

 

 

Key Limitations

In their documentation, Snowflake go into detail about the current limitations of Hybrid Tables. Here, I’m just going to outline the three most important limitations that you should consider if you are thinking about implementing Snowflake Hybrid Tables as part of your solution.

AWS Only

Given the rate at which new features are being added to Hybrid Tables, it’s likely they will come to Azure and GCP at some point; however, I could not find any mention of plans to bring this technology outside of AWS.

Data Quality & Constraints

This should go without saying, but the level of data quality required in a Snowflake Hybrid Table is higher than that required by a standard Snowflake table. Referential integrity, primary keys, uniqueness constraints, and stricter constraints on COPY INTO statements must all be adhered to (as in any traditional RDBMS such as Postgres), so you may need to do significantly more data processing and cleaning to get the data into a format that is acceptable for OLTP.

Quotas and Throttling

While Hybrid Tables have seen significant performance improvements over the last year or so, the number of read/write operations per second is still capped at a quota of 8,000 operations per second in a balanced 80/20 read/write workload. By contrast, under optimal conditions, typical OLTP systems such as Postgres can handle millions of requests per second. If your sole requirement is to maximise performance under an OLTP workload, then Snowflake Hybrid Tables are a long way off in this regard. That said, it is important to note that transactional databases have seen decades of incremental optimisation, whereas Hybrid Tables are a relatively new development and are likely to continue to see rapid improvement over the next couple of years.

Conclusion

At present, Snowflake Unistore is not a one-for-one replacement for transactional databases in all OLTP workloads, and deciding whether this technology is a good fit for your organisation’s solution will require careful consideration of its strengths and limitations – especially regarding throughput. With that being said, even in their infancy, Hybrid Tables have already seen adoption by a range of organisations with a variety of use cases – speaking to the potential unlocked by the ability to seamlessly integrate OLAP with OLTP in a single environment, with low or zero staleness between the two. While the concept of Unistore has existed for some 15 years, only recently have we seen platforms such as Snowflake, BigQuery, and Power BI include the technology as part of their offerings. Although there are still some limitations, rapid advancements in the technology, combined with growing adoption, indicate that Unistore and Hybrid Tables may see much more widespread use in the future.

CCentric retains ISO 27001:2022 Certification https://www.ccentric.co.uk/ccentric-retains-iso-270012022-certification/ Thu, 11 Sep 2025 09:33:18 +0000 https://staging.ccentric.co.uk/?p=3587

C-Centric passed its annual recertification of the ISO 27001:2022 certification – the world’s best-known standards for information security management systems (ISMS).

Achieving ISO 27001 certification involves undergoing a thorough assessment by an accredited certification body to ensure compliance with the standard’s requirements. It provides a formal recognition that a business has effectively implemented information security controls and practices.

“ISO 27001 is a key element of our technological roadmap and represents our dedication to safeguarding our customers sensitive data and ensuring the highest standards of security and compliance,” says David McKee Technology Director

“Our commitment to robust information security and operational excellence drives us to continuously enhance our processes, invest in cutting-edge technologies, and foster a culture of vigilance throughout our company. With ISO 27001, we have fortified our position as a trusted partner, providing peace of mind to our customers and reaffirming our relentless pursuit of maintaining the highest levels of security and trust in everything we do.”

ISO 27001 covers various aspects of information security, including risk assessment and management, asset management, access control, cryptography, physical security, business continuity, and incident management. By implementing ISO 27001, C-Centric has demonstrated its commitment to protecting the confidentiality, integrity, and availability of our business and customers’ information assets.

ISO 27001 certification is currently the most widely adopted international information security standard used by companies worldwide. By following ISO 27001, businesses can be confident that their Information Security Management Systems (ISMS) are up to date and comply with current best practices.

Webinar: Case study – building a Gen AI Digital assistant for telco switching https://www.ccentric.co.uk/webinar-case-study-building-a-gen-ai-digital-assistant-for-telco-switching/ Tue, 05 Aug 2025 09:41:11 +0000 https://staging.ccentric.co.uk/?p=3595

New AI Chatbot for comparison sites https://www.ccentric.co.uk/new-ai-chatbot-for-compariso-sites/ Fri, 22 Sep 2023 09:57:29 +0000 https://www.ccentric.co.uk/?p=3390


C-Centric has launched the first-ever AI chatbot designed to facilitate seamless ‘chat-led’ broadband switching between comparison and provider sites. This has been delivered after close collaboration with Stickee, the tech company specialising in supporting major broadband comparison sites, and Virgin Media, a leading broadband provider.

 

Reassurance

With the cost-of-living crisis, consumers are increasingly looking to switch providers for better-value deals. However, new fibre options, T&Cs and mid-contract price-rise clauses can lead to uncertainty and ‘choice paralysis’. C-Centric’s solution is designed to address these challenges head-on. The PrediCX AI chatbot serves as a knowledgeable guide, helping customers navigate their switching anxieties. If they need specific help from the provider or are ready to proceed, the C-Centric chatbot can seamlessly transfer the customer’s chat conversation and enquiry data directly to Virgin Media’s chat agents.

 

Redefining Customer Service

This innovative approach seeks to redefine the concept of customer service within the telecommunications industry. As Simon Perrin of C-Centric explains:

“Whilst every telecommunications company aims to empower customers to self-serve, we recognise the reality of the challenges consumers face. We have harnessed the power of our PrediCX AI software to create a solution that not only supports customers within the comparison site but also links seamlessly from the comparison bot direct to the provider’s agents, providing a truly streamlined experience. We know from recent research that over 75% of consumers are more likely to make a purchase if they can get answers through messaging. This solution provides consumers with a joined-up messaging experience across organisations. Context is maintained and the tedium of repeating information is eliminated.”

 

A Collaborative Effort

The development of this AI chatbot was not without its challenges. It required a collaborative effort from all parties involved, as Craig Mitchell, the commercial lead from Stickee, highlights:

“Integrating such a ground-breaking solution into our brand-new interface required our teams to work hand in hand, ensuring a smooth customer experience.”

 

Personalisation is key

This initiative marks a significant step towards a more customer-friendly experience within the telecommunications industry, a vision shared by Virgin Media. Chris Huggins, Conversational Commerce Lead at Virgin Media, stresses the importance of innovative customer engagement:

“Our focus is to utilise conversational AI and asynchronous chat to provide convenient, efficient, and personalised support to help reassure customers switching providers online.”

 

A Step towards the future

In conclusion, the launch of this AI chatbot marks a significant milestone in the telecommunications industry. It serves as a testament to the power of AI and the benefits it can bring in redefining customer experiences.

C-Centric’s pioneering initiative, supported by Stickee and Virgin Media, offers a glimpse into the future of customer service, a future where technology and human expertise come together to create seamless, streamlined experiences for customers. This is not only a win for customers but also for the telecommunications industry, setting the stage for further innovation and growth.

 

About C-Centric

C-Centric is a data and AI company based in London, committed to re-engineering customer experiences through application integration, data and AI.

For Media & Partnership enquiries:

About Stickee

Stickee is a comparison tech company specialising in supporting major broadband comparison sites, including Go Compare. They utilise their technical expertise to continually enhance customer experiences within the comparison site industry.

Share

Our solutions …

Fresh data strategies for accelerated customer acquisition

Revolutionary data insights to reduce customer churn and cross-sell more

Expert advice and solutions to guide you through the compliance minefield

]]>
Exclusive Networking Dinner: With guest speaker Nick Dymott from Manchester United https://www.ccentric.co.uk/shifting-mindsets-with-ai-and-data-driven-success/ Fri, 25 Aug 2023 11:23:52 +0000 https://www.ccentric.co.uk/?p=3368

C-Centric Hosts:

Exclusive Networking Dinner: Shifting Mindsets with AI and Data Driven Success

Guest speakers: Nick Dymott, Head of CRM, Manchester United FC , Michael Page, Director C-Centric and Neil Joyce, CEO and Co-Founder of The CLV Group

Event summary: C-Centric brought together senior executives to share knowledge about innovations and challenges at the sharp end of customer engagement. Over a three-course dinner, senior leaders in data, digital and CRM from a broad cross-section of industry sectors – Premier League football, telcos, energy, media, ticketing, charities and financial services – engaged in interactive discussions led by panel Q&A and guest speakers.

Data Partnership Enablement to Identify, Engage, Grow & Monetise a Fanbase

Guest speaker: Nick Dymott from Manchester United was interviewed by Neil Joyce (CLV Group). Nick highlighted a number of key challenges: bringing large volumes of digital behavioural data from the Adobe platform into the cloud data lakehouse; using data clean rooms to increase fan engagement and acquire global fans; and managing the exponential demand on his team from stakeholders across the business.

The key insights from the session:

The two significant priorities and challenges for Nick were:

  1. “Activate the fans we do know – and get as much engagement out of that group as we can.
  2. Grow our known fan base – get as close to addressing the club’s 1.1 billion fans as possible, finding the right mechanisms and methods to do that in particular markets.”

The role of data is evolving as a critical lever in Manchester United’s partnerships with ROI driven sponsors who require the data and insights to support these sponsorship decisions.

“Sport is one of the most extreme verticals, where we have millions of fans, but we are looking to augment our data with robust data and insights that go beyond the limitations of shares, views and likes, along with the panel-based Nielsen ratings data.”

Football clubs do not have large first-party (1P) customer databases. Their sources of data capture include ticketing, ecommerce, web or app registrations and TV channel data. Like many other organisations, the club needs to address these data gaps to understand and activate its fanbase. To do this, Nick acknowledges that partnerships with brands that are open to sharing data are essential.

Data clean rooms are critical enablers of safe, secure data collaboration with affinity partners and publishers – whilst many organisations are open to sharing insights, they will not release their data outside of their secure environments or expose any of their audiences’ PI as part of a data-sharing process.

Question: But how do you bring all this together and scale it?

C-Centric, working in partnership with the CLV Group, can link and connect audiences through data collaboration platforms and data clean rooms. This enables clubs like Manchester United to reach, convert and monetise these unknown fan bases. C-Centric provides the data engineering, identity stitching and a variety of data clean room technologies to help organisations collaborate, ensuring the non-movement of data and preventing any exposure of PI whilst enabling the sharing of powerful, actionable insights. AWS Clean Rooms, InfoSum, BigQuery clean rooms and Snowflake secure data share all provide good technology options for CRM teams.

 

“The emergence of data clean rooms has enabled brands who were previously averse to sharing their data to securely open up and explore the potential overlap with media owners”. One media owner enabled 6m Manchester United fans based in the US to be identified generating insights on their propensity to buy and the offers that appeal to them.

Leveraging different data partners and different data cleanrooms allows organisations, no matter their industry, to collaborate to provide a much more robust data set to acquire new audiences.
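The overlap matching described above can be illustrated with salted hashing, the basic mechanism most clean room designs build on: each party pseudonymises its identifiers locally before comparison, so raw PI never crosses the boundary and only the size of the intersection is shared. This is a minimal sketch; the email addresses and salt are hypothetical, and real clean rooms add contractual controls, noise and minimum-audience thresholds on top.

```python
import hashlib

def pseudonymise(emails, salt="agreed-shared-salt"):
    # Each party hashes locally with a jointly agreed salt, so only
    # irreversible digests -- never raw PI -- enter the comparison.
    return {hashlib.sha256((salt + e.strip().lower()).encode()).hexdigest()
            for e in emails}

club_fans = ["fan.one@example.com", "fan.two@example.com"]
media_audience = ["fan.one@example.com", "viewer@example.com"]

# The intersection yields an audience size and insight, not identities.
overlap = pseudonymise(club_fans) & pseudonymise(media_audience)
print(len(overlap))
```

Because both sides use the same salt and normalisation, matching records produce identical digests without either party ever seeing the other's raw list.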

AI Implementation Innovations

AI became the focus for the second half of the evening, as Michael Page from C-Centric presented three real-life examples of AI implementation within CRM and opened the discussion to attendees to share their experiences.

Here are the key insights from the session.

  1. Demonstration of how PrediCX agentic AI enables high-quality customer conversations and a switching service within the Go Compare comparison site

 

  • Description of PrediCX anti-hallucination functionality.
  • How to combine advanced RAG techniques to achieve answer accuracy.
  • The importance of content maintenance pipelines to ensure ongoing accuracy of answers over time.
  • How to enable “messaging-to-messaging transfer” between publisher and advertiser domains and across heterogeneous messaging and chat platforms, all whilst maintaining conversation context and affiliate tracking links.
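The anti-hallucination idea in the bullets above can be sketched as a retrieval-with-refusal guard: the bot answers only when a knowledge-base passage covers enough of the question, and otherwise escalates to a human. This is purely illustrative, not PrediCX's actual implementation – the passages, the word-overlap score and the threshold are all hypothetical stand-ins for production retrieval.

```python
# Illustrative anti-hallucination guard: answer only when a knowledge-base
# passage covers enough of the question's terms; otherwise escalate.
def coverage(question: str, passage: str) -> float:
    q = set(question.lower().split())
    p = set(passage.lower().split())
    return len(q & p) / len(q)  # fraction of question terms found in passage

def retrieve(question: str, passages: list, threshold: float = 0.5):
    best = max(passages, key=lambda p: coverage(question, p))
    if coverage(question, best) < threshold:
        return None  # refuse rather than guess; hand off to a human agent
    return best

kb = [
    "Mid contract price rise clauses cap increases at CPI plus 3.9 percent.",
    "Full fibre broadband offers symmetric upload and download speeds.",
]
print(retrieve("what is the mid contract price rise cap", kb))
```

The refusal branch is the point: a bot that declines low-confidence questions and transfers the chat is what keeps answer accuracy high over time.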

 

Messaging to messaging transfer

Research shows that 74% of consumers want to use messaging as part of the product enquiry and buying process – this AI BOT-to-messaging bridge enables seamless messaging from display ad/content within the publisher's site through to the advertiser's sales team.
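One way to picture the bridge: the transcript, enquiry data and affiliate tracking link travel together as a single payload when the customer is transferred, so the receiving agent sees the full conversation and attribution is preserved. The field names and values below are a hypothetical schema, not C-Centric's actual transfer format.

```python
import json

def build_handoff(transcript, enquiry, affiliate_link):
    # Everything the advertiser's agent needs arrives in one payload,
    # so the customer never has to repeat themselves.
    return json.dumps({
        "transcript": transcript,          # full BOT conversation so far
        "enquiry": enquiry,                # structured details already captured
        "affiliate_link": affiliate_link,  # preserved for attribution
    })

payload = build_handoff(
    ["Hi, can I keep my number if I switch?",
     "Yes - switching keeps your number."],
    {"product": "full fibre 500", "postcode_checked": True},
    "https://example.com/track?aff=1234",
)
```

Serialising the context once, at the moment of transfer, is what makes the handover work across heterogeneous chat platforms.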

 

  2. Dynamic emails with embedded GPT
  • This is the ability to interact within the body of AMP-formatted emails, using the PrediCX GPT AMP engine.
  • Users can interact within the body of an email, ask multiple questions, and receive responses all without leaving it. This can reduce inbound agent voice calls in customer service and generate additional sales in remarketing/cross-sell emails. Michael highlighted how early trials have been driven in telco from Adobe Campaign for upgrade retention campaigns.
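On the server side, the key AMP-email detail is that the endpoint answering in-email requests must echo an allow-list CORS header (per the AMP-for-email spec) before the mail client will render the response. A minimal sketch of that check, with a hypothetical sender address and allow-list:

```python
def amp_response_headers(sender, allowed=frozenset({"newsletter@example.com"})):
    # AMP-capable mail clients send the message's sender address and
    # reject any response that does not echo it back in this header.
    if sender not in allowed:
        return None  # unknown sender: refuse the in-email interaction
    return {
        "AMP-Email-Allow-Sender": sender,
        "Content-Type": "application/json",
    }
```

This gate is what lets a GPT backend answer multiple questions inside the email body while only ever serving mail it actually sent.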

 

“I’ve tried dynamic emails but never with generative AI within it, we would ask one survey question, but to take it to a generative AI level all within an email, well that’s impressive”.  – Attendee from a multinational electric utility company.

  3. AI for insight & MI

Michael then talked through how to build a semantic data layer to enable AI to support natural language data queries. He showed how ecommerce sites and marketplaces can use this to offer data discovery and smart filters over product catalogues.
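A semantic data layer can be sketched as a controlled vocabulary between natural language and SQL: business terms map to vetted expressions, so the AI can only compose queries from definitions the data team has approved. The metric names, filters and columns below are hypothetical, not a real catalogue schema.

```python
# Hypothetical semantic layer: approved business terms -> SQL fragments.
SEMANTIC_LAYER = {
    "metrics": {
        "revenue": "SUM(order_total)",
        "active customers": "COUNT(DISTINCT customer_id)",
    },
    "filters": {
        "last 30 days": "order_date >= CURRENT_DATE - 30",
        "uk only": "country = 'GB'",
    },
}

def to_sql(metric, filter_term, table="orders"):
    # Unknown terms raise KeyError instead of reaching the database,
    # which is what keeps natural-language querying safe.
    select = SEMANTIC_LAYER["metrics"][metric]
    where = SEMANTIC_LAYER["filters"][filter_term]
    return f"SELECT {select} FROM {table} WHERE {where}"

print(to_sql("revenue", "last 30 days"))
```

In practice an LLM (or rule-based parser) picks the terms, but the layer – not the model – owns the SQL, which is the design choice that makes the answers trustworthy.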

If you’re interested in understanding more about how C-Centric can help you implement AI into your CRM and CS activities, please contact mark.hurd-bennett@ccentric.co.uk

Share
]]>
C-Centric completes the acquisition of Warwick Analytics https://www.ccentric.co.uk/c-centric-completes-the-acquisition-of-warwick-analytics/ Mon, 20 Dec 2021 09:59:53 +0000 https://www.ccentric.co.uk/?p=3150

C-Centric completes the acquisition of Warwick Analytics

C-Centric has completed the acquisition of Warwick Analytics, a machine learning AI company. Warwick Analytics was originally a Warwick University AI spin-out, and its data science team has developed proprietary natural language algorithms over a decade of academic research.

PrediCX, the flagship product, which works on multiple platforms including Zendesk and Salesforce, automates and prioritises workflow with tags by topic, sentiment, intent and urgency. PrediCX uses Artificial Intelligence and Sentiment Analysis to review and interpret the customer conversation. Identifying what is really being said enables fast tracking of complaints, issues and urgent enquiries to improve the customer experience.

Says Michael Page, Director at C-Centric: “Warwick’s consultants have led major projects to apply conversational AI within service operations at large organisations, delivering increased levels of self-serve, agent productivity and insight. We will be embedding PrediCX across our customer management solutions and accelerating the roll-out of third-party integrations.”

To discuss AI and PrediCX or to explore the next generation in customer service, contact C-Centric by telephone on 0203 130 4764 or email on PrediCXinfo@ccentric.co.uk

Share
]]>
Adobe Experience User Group Dinner – 30th July 2019 https://www.ccentric.co.uk/adobe-experience-user-networking-group/ Wed, 06 Nov 2019 12:24:37 +0000 https://www.ccentric.co.uk/?p=3091

Adobe Experience User Group – Event Summary

Date : Tuesday 30th July 2019 – start at 18:00

Guest speaker: Adam Ingram, Senior Marketing Manager, The Telegraph

Venue: The River Cafe, Thames Wharf, Rainville Rd, London W6 9HA. https://rivercafe.co.uk/.

 

The group received an excellent presentation by Adam Ingram, Senior Marketing Manager at The Telegraph.

 

Adam explained how he & his team are transforming their approach to segmentation and have developed an audience-first approach.  One key message was that your audience strategy and your segment design in AAM can either amplify or restrict what you can achieve with Adobe Experience Cloud.

 

Adam described how The Telegraph operationalised the monitoring of habits and engagement levels across the customer base, beyond the first layer. Performance of audiences was evaluated against business KPIs, as opposed to the traditional approach of analysing individual marketing channels. Armed with this insight, they are personalising the most common and effective journeys towards a subscribing relationship. This is increasing customer engagement by serving ads relevant to each customer’s stage of relationship with the brand, which in turn increases traffic, lead generation and conversion.

 

Adam’s presentation stimulated wide-ranging round-table discussion among attendees.

 

Some of the key takeaways and observations from the group:

  • The impressive business results that can be achieved by combining a high-volume email registration programme with outbound comms driven by web behavioural audiences created in AAM.
  • The benefits of activating web behavioural data into other channels (email/SMS and contact centre).
  • Going beyond segment-level to trait-level data by processing CDF data files.
  • The benefits of bringing a cross-disciplinary team to segment design workshops.
  • The need to look at segment design through multiple dimensions – mission, profile, content and journey context.
  • The wide-ranging DSP match rates experienced across the group.
  • The need for often multiple identity graphs to get to a true 360° visitor picture and improve targeting.
  • The strategies required as we move to a potentially third-party cookie-less world.

 

If you are interested in attending the next C-Centric Adobe Experience User Group, please drop an email to Ola Olaniawo @ ola.olaniawo@ccentric.co.uk who heads up the C-Centric Adobe consulting practice

About the Adobe Experience User Group

A London-based user group of Adobe Experience Cloud users and owners, set up to facilitate the sharing of knowledge and learnings in a relaxed environment. These learnings can be both commercial and technical, and our members come from both execution and management.

 

About Ola Olaniawo – Group Co-ordinator

I head up C-Centric’s Adobe consulting practice and have over six years’ experience leading large Adobe and experience personalisation solutions across telcos/media, retail and financial services at Accenture, Vodafone Group, HSBC, RBS and PwC. I also have over 15 years of leading digital customer experience solution implementations across various industries. https://www.ccentric.co.uk/

30/07/19 Adobe Experience User Group Dinner Agenda

18.00-18.30: Meet & drinks @ The River Café garden overlooking the river

18.45: Sit down in event room and introduction by Ola

19.00: Three Course Meal

Talk by Adam Ingram, Senior Marketing Manager, The Telegraph

Adam will speak on how The Telegraph refined a customer-centric approach across their core channels while building new ways to engage and create revenue-driving relationships with their customers.

Round up by Ola and Andrew Coe on real world Personalisation best practices

Live implementation learnings on experience personalisation strategies from a telco and financial services perspective.

Questions for the Panel.

21.00: Event ends

Share

Contact us for more details:
ola.olaniawo@ccentric.co.uk


]]>
Webinar – Outbound compliance https://www.ccentric.co.uk/gdpr-webinar-20-11-17/ Wed, 19 Jun 2019 09:44:30 +0000 https://www.ccentric.co.uk/?p=2917

How to keep your outbound operations compliant

Neil Devadason

Head of Data Analytics & Data Strategy

Suella Sharp

Contact Centre Performance & Transformation Consultant


In this webinar we examine the impact of GDPR and the latest 2017 Ofcom regulation changes on outbound contact centre operations.

Topics

  • Impact of changes
  • Practical action plan

 Sign up for updates

Please email me your whitepaper on GDPR – How to keep your outbound operations compliant:


]]>