MarketCheck Data Feeds can be delivered through various platforms and protocols to accommodate different infrastructure setups and security requirements. Delivery options fall into two main categories: MarketCheck-owned infrastructure where clients pull data, and client-owned infrastructure where MarketCheck pushes data.
MarketCheck-Owned Infrastructure:
Clients pull data from MarketCheck-managed locations. MarketCheck provides credentials and handles infrastructure maintenance.
Client responsibilities: Use provided credentials and download files within the retention window - no cleanup required.
Platform | Webhook Support | File Size Limit |
---|---|---|
FTP | ✅ | 5 TB per file |
SFTP | ✅ | 5 TB per file |
Google Cloud Storage | ✅ | 5 TB per file |
Client-Owned Infrastructure:
MarketCheck pushes data directly to client-managed locations based on the agreed delivery frequency.
Client responsibilities: Provide access credentials and permissions, manage storage costs and retention policies, handle cleanup of old files.
Platform | Webhook Support | File Size Limit |
---|---|---|
AWS S3 | ✅ | 5 TB per file |
Azure Blob Storage | ✅ | 4.75 TB per file |
Google Cloud Storage | ✅ | 5 TB per file |
SFTP | ✅ | Server dependent |
FTP | ✅ | Server dependent |
Dropbox | ❌ | 350 GB per file |
Google Drive | ❌ | 15 GB per file |
Credentials Setup:
Grant the MarketCheck service account access to your bucket:
airflow-auth@marketcheck-gcp.iam.gserviceaccount.com
Required Permissions:
Permission | Description |
---|---|
roles/storage.objectUser | Basic object access role |
storage.objects.create | Upload files to bucket |
storage.objects.get | Read file content |
storage.objects.delete | Delete files when needed |
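One way to grant the role above is a bucket-level IAM binding. The bucket name below is a placeholder; the service account and role come from this document.

```shell
# Grant MarketCheck's service account object access on your delivery bucket.
# Replace your-feed-bucket with your actual bucket name.
gcloud storage buckets add-iam-policy-binding gs://your-feed-bucket \
  --member="serviceAccount:airflow-auth@marketcheck-gcp.iam.gserviceaccount.com" \
  --role="roles/storage.objectUser"
```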
Required Credentials:
bucket-owner-full-control (S3 canned ACL)
Required Permissions:
Permission | Description |
---|---|
s3:PutObject | Upload files to bucket |
s3:GetObject | Read file content |
s3:HeadObject | Retrieve file metadata (required to verify complete successful upload) |
s3:DeleteObject | Delete files when needed |
s3:ListBucket | List files in the bucket |
s3:AbortMultipartUpload | Abort incomplete multipart uploads |
s3:ListMultipartUploadParts | List parts of a multipart upload |
s3:ListBucketMultipartUploads | List all multipart uploads in the bucket |
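The actions above can be expressed as a bucket policy. This is a sketch only: the bucket name and principal ARN are placeholders, and the action list simply mirrors the table (note that object-level actions apply to `arn:aws:s3:::bucket/*` while the `ListBucket*` actions apply to the bucket ARN itself).

```python
# Build a sample S3 bucket policy granting the actions listed above.
# Bucket name and principal ARN are placeholders, not real values.
import json

BUCKET = "your-feed-bucket"  # placeholder
PRINCIPAL = "arn:aws:iam::111111111111:user/marketcheck-delivery"  # placeholder

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "MarketCheckObjectAccess",
            "Effect": "Allow",
            "Principal": {"AWS": PRINCIPAL},
            # Object-level actions, mirroring the table above
            "Action": [
                "s3:PutObject", "s3:GetObject", "s3:HeadObject",
                "s3:DeleteObject", "s3:AbortMultipartUpload",
                "s3:ListMultipartUploadParts",
            ],
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        },
        {
            "Sid": "MarketCheckBucketAccess",
            "Effect": "Allow",
            "Principal": {"AWS": PRINCIPAL},
            # Bucket-level actions
            "Action": ["s3:ListBucket", "s3:ListBucketMultipartUploads"],
            "Resource": f"arn:aws:s3:::{BUCKET}",
        },
    ],
}
print(json.dumps(policy, indent=2))
```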
Required Credentials:
SAS token scoped to the target container (or the RBAC alternative below).
SAS Token Permissions:
Permission | Description |
---|---|
c | Create (upload) blobs |
r | Read blobs |
w | Write blobs |
l | List blobs |
d | Delete blobs |
RBAC Alternative:
Assign the Storage Blob Data Contributor role instead of a SAS token.
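A container SAS carrying the permissions listed above could be generated along these lines. The account, container, key, and expiry are placeholders; the permission letters are the ones from the table (supplied in the service's canonical r-c-w-d-l order).

```shell
# Generate a container SAS with read/create/write/delete/list permissions.
# Account name, key, container, and expiry are placeholders.
az storage container generate-sas \
  --account-name yourstorageaccount \
  --account-key "<account-key>" \
  --name marketcheck-feeds \
  --permissions rcwdl \
  --expiry 2026-01-01T00:00Z \
  --output tsv
```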
Required Credentials:
Credential | Description |
---|---|
Host | SFTP server domain/IP |
Username | SFTP username |
Authentication | Password OR Private Key file (with optional passphrase) |
Filesystem Permissions:
Permission | Description |
---|---|
Read | Check file size and properties |
Write | Upload files |
Delete | Remove files when needed |
Execute | Traverse and list directories |
Required Credentials:
Credential | Description |
---|---|
Host | FTP server domain/IP |
Username | FTP username |
Password | FTP password |
Required Permissions:
Permission | Description |
---|---|
Write/Upload | Store files on server |
Read | Check file properties and size |
Delete | Remove files when needed |
Required App Permissions:
files.content.write - Upload and delete files
files.metadata.read - Read file metadata
account_info.read - Validate account access
Required Permissions:
drive.file or drive.content permissions via folder sharing
Files follow a standardized naming pattern with feed prefix and date suffix:
feedname_YYYYMMDD.csv.gz
MarketCheck-Owned Infrastructure:
Client-Owned Infrastructure:
MarketCheck provides two notification methods to confirm successful file delivery. They can be used independently or in combination, depending on your integration requirements, to trigger downstream processes.
Webhooks are HTTP callbacks that MarketCheck sends to your specified endpoint immediately after successful file delivery. You need to implement a webhook endpoint on your server to receive these real-time notifications, which can then trigger your downstream data processing workflows.
Basic Webhook Structure:
{
"status": "SUCCESS",
"file_name": "mc_us_used_20250721.csv.gz",
"file_size": "12345678",
"directory": "[platform-specific-path]"
}
Platform-Specific Directory Fields:
Platform | Directory Field Format | Example |
---|---|---|
FTP/SFTP | "directory": "/path/to/files" | "/marketcheck/daily" |
Azure Blob | "directory": "container-name" | "marketcheck" |
GCS | "directory": "/path" | "/daily" |
AWS S3 | "directory": "s3://bucket/path" | "s3://marketcheck/daily" |
For GCS deliveries, an additional bucket field is included in the payload. The bucket field specifies the GCS bucket name where the file is stored, and directory is the path within that bucket:
{
"status": "SUCCESS",
"file_name": "mc_us_used_20250721.csv.gz",
"file_size": "12345678",
"directory": "/daily",
"bucket": "marketcheck"
}
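Resolving a payload to a full file location therefore differs by platform. The sketch below assumes you already know which platform a given delivery uses (that is part of your delivery configuration, not the payload); the function name is illustrative.

```python
# Sketch: resolve a webhook payload to a full file location per platform,
# following the directory-field formats documented above.
def file_location(platform: str, payload: dict) -> str:
    directory = payload["directory"].rstrip("/")
    name = payload["file_name"]
    if platform == "gcs":
        # GCS payloads carry an extra bucket field; directory is a path inside it.
        return f"gs://{payload['bucket']}{directory}/{name}"
    if platform == "s3":
        return f"{directory}/{name}"   # directory is already s3://bucket/path
    if platform == "azure":
        return f"{directory}/{name}"   # directory is the container name
    return f"{directory}/{name}"       # FTP/SFTP absolute path
```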
Authentication:
X-API-Key header for webhook endpoint security.
Example Webhook Request:
curl -X POST https://mc-hooks.awesomecars.com/ack \
-H "Content-Type: application/json" \
-H "X-API-Key: your-secret-key" \
-d '{
"status": "SUCCESS",
"file_name": "mc_us_used_20250721.csv.gz",
"directory": "s3://marketcheck/daily",
"file_size": "12345678"
}'
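On the receiving side, a minimal endpoint only needs to verify the X-API-Key header and the payload shape before triggering downstream processing. This is a standard-library sketch; the key and port are placeholders, and the pure check_webhook helper is separated out so it can sit behind any web framework.

```python
# Minimal webhook receiver sketch using only the standard library.
# EXPECTED_KEY and the port are placeholders.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

EXPECTED_KEY = "your-secret-key"  # placeholder
REQUIRED_FIELDS = {"status", "file_name", "file_size", "directory"}

def check_webhook(headers: dict, body: bytes, expected_key: str) -> tuple[int, str]:
    """Validate an incoming webhook; return (HTTP status, message)."""
    if headers.get("X-API-Key") != expected_key:
        return 401, "bad api key"
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return 400, "invalid json"
    if not REQUIRED_FIELDS <= payload.keys():
        return 400, "missing fields"
    return 200, "ok"

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        status, msg = check_webhook(dict(self.headers), body, EXPECTED_KEY)
        self.send_response(status)
        self.end_headers()
        self.wfile.write(msg.encode())

if __name__ == "__main__":
    HTTPServer(("", 8080), WebhookHandler).serve_forever()
```

Returning 200 promptly and deferring heavy processing to a queue or background job avoids webhook delivery timeouts.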
Optional file-based notifications delivered alongside or instead of webhooks:
Success Notification:
mc_us_used_20250721.success.txt
Failure Notification:
mc_us_used_20250721.failure.txt
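A polling consumer can derive both marker names from the data file name, as in the examples above. This helper is a sketch (the function name is illustrative) and assumes the `.csv.gz` extension shown throughout this document.

```python
# Map a delivery file to its success/failure notification file names,
# following the examples above (extension stripped, marker suffix added).
def marker_names(file_name: str) -> tuple[str, str]:
    stem = file_name
    for ext in (".csv.gz", ".csv"):
        if stem.endswith(ext):
            stem = stem[: -len(ext)]
            break
    return f"{stem}.success.txt", f"{stem}.failure.txt"
```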