What's in the download
Each comment on the post becomes one row in your Excel/CSV (or one object in JSON). The fields:
- username — the @-handle of the commenter (no display name — those change too often to be useful).
- comment — the comment text, with @mentions preserved exactly as written.
- created_at_utc — ISO 8601 UTC timestamp.
- created_at_local — same timestamp shifted to your browser's local timezone (helpful for reporting).
- likes — current like count on the comment.
- reply_to — the @-handle this is a reply to (empty for top-level comments). Lets you reconstruct threads in Excel with a single VLOOKUP.
- permalink — direct URL to that comment on Instagram.
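As a quick sanity check, here is what one exported record looks like as a Python dict. Field names match the schema above; the values (handle, text, timestamps, permalink) are invented for illustration:

```python
# A hypothetical exported record; field names follow the export schema,
# every value below is made up for illustration.
record = {
    "username": "coffeelover_99",
    "comment": "Count me in! @bestie_anna look at this",
    "created_at_utc": "2024-05-06T14:32:07Z",
    "created_at_local": "2024-05-06T10:32:07-04:00",
    "likes": 12,
    "reply_to": "",  # empty string => top-level comment, not a reply
    "permalink": "https://www.instagram.com/p/EXAMPLE/",  # invented URL
}

# An empty reply_to marks a top-level comment.
is_top_level = record["reply_to"] == ""
print(is_top_level)  # True
```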
Format guide
Excel (.xlsx)
Best for human review and one-off reports. The downloader formats timestamp columns as Excel dates, freezes the header row, and adds an auto-filter on every column. Open it and you can sort by likes / username / date instantly.
CSV (.csv)
Best for ETL / database import. UTF-8 BOM-prefixed so Excel-on-Windows opens it correctly. Newlines in comments are escaped per RFC 4180. Use this format if you're piping the data into Tableau, Power BI, or your data warehouse.
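If you consume the CSV in code rather than Excel, the two details above matter: decode with `utf-8-sig` so the BOM is stripped, and open with `newline=""` so the `csv` module can handle quoted fields that contain line breaks. A minimal sketch with simulated file contents (the column subset is an assumption):

```python
import csv
import io

# Simulated file bytes: UTF-8 BOM, then a comment with an embedded
# newline, quoted per RFC 4180. Only a few columns shown.
data = '\ufeffusername,comment,likes\n@anna,"line one\nline two",3\n'.encode("utf-8")

# For a real file: open(path, newline="", encoding="utf-8-sig").
# "utf-8-sig" strips the BOM; newline="" defers newline handling to csv.
buf = io.TextIOWrapper(io.BytesIO(data), encoding="utf-8-sig", newline="")
rows = list(csv.DictReader(buf))

print(rows[0]["comment"])  # the two lines arrive as a single field
```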
JSON (.json)
Best for code / API consumers. Preserves the nested reply structure as { "comment": {...}, "replies": [{...}, {...}] } so you don't need to do parent-lookups yourself.
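The nested shape means a short recursive walk is all you need to visit every comment with its parent already known. A sketch against a toy payload in that shape (the inner field set is trimmed for brevity):

```python
import json

# Toy payload in the documented shape; comment bodies are invented
# and only two of the export fields are shown.
raw = """
[
  {"comment": {"username": "anna", "likes": 4},
   "replies": [
     {"comment": {"username": "ben", "likes": 1}, "replies": []}
   ]}
]
"""

def flatten(nodes, parent=""):
    """Yield (parent_handle, comment_dict) pairs, depth-first."""
    for node in nodes:
        comment = node["comment"]
        yield parent, comment
        yield from flatten(node["replies"], parent=comment["username"])

pairs = list(flatten(json.loads(raw)))
print(pairs[1][0])  # ben's reply carries "anna" as its parent handle
```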
Common workflows
- Giveaway / contest entry list — download the comments under your contest post, then drop the file into our random comment picker to draw a winner.
- Sentiment analysis — paste the CSV into ChatGPT or Anthropic Claude with a prompt like "Score each row 1-5 for positive sentiment".
- Influencer report — count comments per @user with a pivot table to find your most engaged followers.
- FTC compliance archive — store the export alongside the sponsored-post screenshot for the 2-year FTC retention requirement.
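The influencer-report count is a pivot table in Excel; over the CSV export the same tally is a few lines of Python (the column names and rows here are assumptions):

```python
import csv
import io
from collections import Counter

# Hypothetical export contents; only the username column matters here.
raw = "username,comment\nanna,hi\nben,nice shot\nanna,entering again\n"

counts = Counter(row["username"] for row in csv.DictReader(io.StringIO(raw)))
print(counts.most_common(1))  # most engaged commenter first
```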
Bulk and scheduled exports
Need to download comments from 50 posts at once, or every new post automatically? Our paid plans include bulk URL upload (one CSV with N URLs → one ZIP with N files, one per URL) and scheduled exports (cron-style: "every Monday at 9am, export new comments on @mybrand"). Both deliver to S3, Google Drive, or webhook.
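On the consuming side, a bulk-export ZIP of per-post CSVs can be merged into one row list with the standard library. A sketch using an in-memory stand-in for the delivered ZIP (the per-post file names are assumptions):

```python
import csv
import io
import zipfile

# Build a tiny in-memory ZIP standing in for a bulk-export result:
# one CSV per post URL. File names are invented for illustration.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("post_1.csv", "username,comment\nanna,hi\n")
    z.writestr("post_2.csv", "username,comment\nben,yo\n")

# Merge every per-post CSV into one list, tagging each row with its source.
rows = []
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as z:
    for name in z.namelist():
        text = z.read(name).decode("utf-8-sig")  # tolerate a BOM
        for row in csv.DictReader(io.StringIO(text)):
            row["source_file"] = name
            rows.append(row)

print(len(rows))  # one merged list across all posts
```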