Exporting payments data

Export your payments data to Redshift from Hyperswitch

Exporting your payments data to Amazon Redshift enhances analytics by leveraging Redshift's high-performance query capabilities. This allows for efficient data analysis, reporting, and business intelligence, thereby helping you derive valuable insights.

Architecture

Integration steps

  1. Prerequisite: You need to have an AWS account with Redshift enabled. More details can be found at https://aws.amazon.com/redshift/.

  2. Create a new IAM role for Redshift use and provide Hyperswitch with the corresponding role ARN. This IAM role must be configured with S3 read permissions. (Example image of an IAM role created.)

  3. After you share the ARN with Hyperswitch, we will share the S3 bucket and path to be synced for the data, along with granting access to the IAM role so that you will be able to fetch the files from S3.

  4. Once the above steps are done, you need to create the table schema on Redshift and handle the ingestion and post-processing of the data, either via auto-ingestion using Redshift (OR) using scripts, as described in the sections below.

File format and paths specifications

  1. The files will be plain CSV files, with the first row being a header.

  2. The file path will look like s3://<bucket>/<merchant_id>/<version>/payments/<date>.csv

  3. There will be one CSV file corresponding to each day, for up to 7 days. Updates to the payments data will be made in place.

  4. Changes to the file format, content, or the like would change the version in the above path and would be communicated.
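
Since updates land as in-place rewrites of a given day's file, you can also (re)load a single object instead of the whole prefix. The sketch below is illustrative only: it assumes the payments_stage staging table introduced in the "Ingesting data from S3" section, and the placeholders are kept exactly as in the path specification above.

-- Reload one specific day's file into the staging table before re-running the merge
copy payments_stage from 's3://<bucket>/<merchant_id>/<version>/payments/<date>.csv'
credentials 'aws_iam_role=<ARN_ROLE>'
IGNOREHEADER 1
timeformat 'YYYY-MM-DD HH:MI:SS'
csv;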

Data update frequency & retention

Data update schedule: every 6 hours, for up to 7 days

Data retention on the S3 folder: 7 days

Type of data exposed: payments, as per the schema below

Data storage location: us-east-1

Auto ingestion using Redshift

  1. Redshift supports ingesting CSV data directly from S3 files, which we’ll rely on.

  2. The ingestion into Redshift would happen via a COPY job. This can be automated via the following options: you can use the auto copy job if running a preview cluster, or the more mainstream lambda loader.
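
As a rough sketch of the first option, an auto copy job attaches a COPY definition to the S3 prefix so that newly arriving files are loaded automatically. The statement below is illustrative only: the JOB CREATE clause assumes the auto-copy feature is available on your cluster, payments_landing is a hypothetical persistent staging table (an auto copy job cannot target a session-scoped temp table), and the deduplicating merge described later still has to run separately. Verify the exact syntax against the current Redshift documentation.

-- Illustrative auto copy job: new CSV files under the prefix are loaded as they arrive
copy payments_landing from 's3://<BUCKET_NAME>/<MERCHANT_ID>/<VERSION>/payments'
iam_role '<ARN_ROLE>'
IGNOREHEADER 1
timeformat 'YYYY-MM-DD HH:MI:SS'
csv
JOB CREATE payments_auto_copy AUTO ON;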

Table creation/schema

CREATE TABLE payments (
  payment_id VARCHAR(64),
  attempt_id VARCHAR(64),
  status TEXT,
  amount INTEGER,
  currency VARCHAR(10),
  amount_to_capture INTEGER,
  customer_id VARCHAR(64),
  created_at TIMESTAMP,
  order_details VARCHAR(255),
  connector VARCHAR(255),
  error_message VARCHAR(255),
  connector_transaction_id VARCHAR(255),
  capture_method VARCHAR(255),
  authentication_type VARCHAR(255),
  mandate_id VARCHAR(64),
  payment_method VARCHAR(255),
  payment_method_type TEXT,
  metadata TEXT,
  setup_future_usage TEXT,
  statement_descriptor_name TEXT,
  description TEXT,
  off_session TEXT,
  business_country TEXT,
  business_label TEXT,
  business_sub_label TEXT,
  allowed_payment_method_types TEXT
);

Ingesting data from S3

-- Stage the exported CSV files into a temporary table shaped like payments
create temp table payments_stage (like payments);

copy payments_stage from 's3://<BUCKET_NAME>/<MERCHANT_ID>/<VERSION>/payments' 
credentials 'aws_iam_role=<ARN_ROLE>'
IGNOREHEADER 1
timeformat 'YYYY-MM-DD HH:MI:SS'
csv;

-- Upsert the staged rows into the main table, keyed on payment_id
MERGE INTO payments USING payments_stage ON payments.payment_id = payments_stage.payment_id
WHEN MATCHED THEN UPDATE SET
payment_id = payments_stage.payment_id,
attempt_id = payments_stage.attempt_id,
status = payments_stage.status,
amount = payments_stage.amount,
currency = payments_stage.currency,
amount_to_capture = payments_stage.amount_to_capture,
customer_id = payments_stage.customer_id,
created_at = payments_stage.created_at,
order_details = payments_stage.order_details,
connector = payments_stage.connector,
error_message = payments_stage.error_message,
connector_transaction_id = payments_stage.connector_transaction_id,
capture_method = payments_stage.capture_method,
authentication_type = payments_stage.authentication_type,
mandate_id = payments_stage.mandate_id,
payment_method = payments_stage.payment_method,
payment_method_type = payments_stage.payment_method_type,
metadata = payments_stage.metadata,
setup_future_usage = payments_stage.setup_future_usage,
statement_descriptor_name = payments_stage.statement_descriptor_name,
description = payments_stage.description,
off_session = payments_stage.off_session,
business_country = payments_stage.business_country,
business_label = payments_stage.business_label,
business_sub_label = payments_stage.business_sub_label,
allowed_payment_method_types = payments_stage.allowed_payment_method_types
WHEN NOT MATCHED THEN INSERT VALUES (
payments_stage.payment_id,
payments_stage.attempt_id,
payments_stage.status,
payments_stage.amount,
payments_stage.currency,
payments_stage.amount_to_capture,
payments_stage.customer_id,
payments_stage.created_at,
payments_stage.order_details,
payments_stage.connector,
payments_stage.error_message,
payments_stage.connector_transaction_id,
payments_stage.capture_method,
payments_stage.authentication_type,
payments_stage.mandate_id,
payments_stage.payment_method,
payments_stage.payment_method_type,
payments_stage.metadata,
payments_stage.setup_future_usage,
payments_stage.statement_descriptor_name,
payments_stage.description,
payments_stage.off_session,
payments_stage.business_country,
payments_stage.business_label,
payments_stage.business_sub_label,
payments_stage.allowed_payment_method_types
);

drop table payments_stage;

The above query creates a temporary table to load all the CSV data and then merges it with the main table, deduplicating based on payment_id.
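
As an optional sanity check (not part of the required setup), you can confirm after the merge that no duplicate payment_id values remain; the following query should return zero rows:

-- Verify that payment_id stayed unique in the payments table after the merge
SELECT payment_id, COUNT(*) AS copies
FROM payments
GROUP BY payment_id
HAVING COUNT(*) > 1;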
