Look, I get it. You're excited about an AI app that actually builds custom tools for you, but then you stop right before downloading and think: "Wait… is this thing going to sell my data to some random ad network?"

I had the same pause.

Hanks here. I started testing Macaron AI (launched in August 2025 by developer MINDAI) as part of my workflow experiments. I needed an AI tool that could handle meal tracking, habit journaling, and travel planning without forcing me into fifteen different subscription apps. But before I dumped any real data into it, I spent an entire afternoon pulling apart its privacy policy, App Store disclosures, and early user reviews.

Here's what I found: the app is built with privacy-first defaults that actually work. But — and this matters — you still need to know what you're granting access to and how to lock down the settings that most people miss.

This isn't a theoretical safety guide. This is the exact checklist I used to test Macaron safely, reduce unnecessary permissions, and avoid the classic "oh crap, I gave it access to everything" moment that happens with most apps.

If you're the kind of person who reads privacy policies at 11 PM before signing up for anything, this one's for you.


What "Safe" Means for Users

Here's the thing most app safety articles get wrong: they treat "safe" like a binary yes/no question. It's not.

When I evaluate whether an app is safe, I'm looking at three layers: what data it collects, how it protects that data, and whether I can actually control what happens to it. Most apps fail at layer three.

Macaron's approach is different in a way that surprised me. The privacy policy doesn't just list protections — it gives you functional opt-outs and deletion rights that actually work. I tested them. More on that in a second.

Data Security

First question I ask with any app: if someone breaks into their servers tomorrow, how screwed am I?

According to the privacy policy and App Store disclosures, all data is encrypted in transit using TLS and at rest using AES-256. That's the same encryption standard used by banks and password managers. If you're not familiar with encryption jargon, here's what matters: your data is scrambled both when it's moving between your phone and their servers and when it's sitting in storage.
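If you want to see what "encryption in transit" looks like from the client side, here's a minimal Python sketch (an illustration only, not Macaron's actual code) of the TLS settings any modern HTTPS client enforces before a single byte of your data leaves the device:

```python
import ssl

# Build the same kind of client-side TLS context any modern HTTPS
# connection uses. This is "encryption in transit": the protocol is
# negotiated before any data leaves the device.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse outdated protocols

# Safe defaults: the server's certificate must be valid, and it must
# match the hostname you asked for.
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True
```

The point: on iOS and modern Python alike, these protections are on by default, which is why "TLS in transit" is table stakes rather than a selling point.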

But encryption is table stakes. What caught my attention was the data retention policy. Data is only retained as long as necessary for providing services, and users can request deletion. I tested this by submitting a deletion request for a test account. Response time: 48 hours. Complete wipe confirmed.

One thing worth knowing: if you're in the EU, UK, or Switzerland, your data may be transferred to US servers, but Macaron uses Standard Contractual Clauses to ensure equivalent protection. I'm not a lawyer, but SCCs are the EU-approved mechanism for cross-border transfers. It's legit.

Also: the app is restricted to users 18 and older. If a minor's data somehow gets collected, it's deleted on notification. Good boundary.

Privacy Standards

This is where I got weirdly nerdy and spent two hours cross-referencing the privacy policy against GDPR requirements and California's CCPA.

Here's what actually matters for you:

They don't sell your data. Personal information is not sold or shared for targeted advertising. No creepy retargeting, no data broker pipelines. I verified this by checking their third-party service providers list — the only analytics tool mentioned is Mixpanel, and that's limited to event data (like "user opened meal tracker"), not chat content.

You have real rights. This isn't "contact us and we'll think about it" territory. You can request access to your data, get it in machine-readable format (portability), correct inaccuracies, delete everything, restrict processing, object to certain uses, and withdraw consent. I tested the access request — got a JSON file with all my account data within three days.
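If you're curious what to do with an export like that, here's how I skim one: a minimal Python sketch that lists the top-level categories so you can see what's actually stored. The field names below are hypothetical, since the exact export schema isn't documented.

```python
import json

# Hypothetical sample of an account-data export; the real field
# names may differ from what Macaron actually returns.
export = json.loads("""
{
  "account": {"email": "test@example.com", "created": "2025-10-02"},
  "tools": [{"name": "meal-tracker", "entries": 42}],
  "usage_events": [{"event": "opened_meal_tracker", "ts": "2025-10-03"}]
}
""")

# List the top-level categories and how many items each holds.
for category, payload in export.items():
    size = len(payload) if isinstance(payload, (list, dict)) else 1
    print(f"{category}: {size} item(s)")
```

Ten lines of skimming beats scrolling a raw JSON dump, and it makes it obvious if a category you didn't expect (say, location history) shows up in your export.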

One quirk I noticed: they verify your identity before processing rights requests. Annoying if you're in a hurry, but smart from a security perspective. Prevents someone from social-engineering their way into your data deletion.

Comparison: Macaron vs. Typical AI Apps

| Feature | Macaron AI | Typical AI App | Source |
| --- | --- | --- | --- |
| Data Selling | No | Often yes (for ads) | Privacy Policy |
| Encryption | TLS + AES-256 | Varies (often weaker) | App Store |
| Deletion Requests | Processed in 48-72 hrs | Often delayed/ignored | Tested personally |
| GDPR Compliance | Yes (with SCCs) | Inconsistent | Privacy Policy |
| Data Retention | Only as needed | Indefinite storage common | Privacy Policy |

Permission-by-Permission Review

Okay, this is the part where I got paranoid and started denying permissions one by one to see what broke.

Most apps request a ton of permissions upfront, then get mad when you deny them. Macaron takes a different approach: permissions are requested on-demand, which means you only get prompted when you actually use a feature that needs it.

That's huge. It means you're not blindly granting access to your entire phone on day one.

Camera / Microphone

When it asks: If you try to upload a meal photo for calorie tracking or use voice input in a chatbot tool you built.

What it's used for: User content such as file uploads; it's treated as service-personalization data and isn't shared externally unless you initiate it.

My test: I built a meal tracker that scans food photos. The camera permission prompt appeared exactly when I tapped "Add Photo" — not before. I denied it, and the app just let me type the meal instead. No tantrum, no locked features.

What I keep enabled: Camera access, but only when I'm actively using photo-based tools. I revoke it in iOS settings when I'm done testing.

Risk level: Low, assuming you trust encryption. Your photos are processed for the specific tool you're using, then stored encrypted. If you're uploading sensitive documents, maybe reconsider — but for meal pics or habit tracking visuals, you're fine.

Storage Access

When it asks: When you upload files, create tools that save data locally (like journals or trackers), or export data.

What it's used for: Storing user-generated content; collected as device data and encrypted.

My test: I created a habit tracker that logs daily entries. The app asked for storage access to save my data locally for offline use. I granted it, then checked what files were created — all stored in the app's sandboxed directory, encrypted. No random folders littering my device.

What I keep enabled: Read/write access for the specific tools I use. I regularly audit what's stored via the app's data export feature.

Risk level: Low. The app doesn't appear to scan your entire photo library or documents folder — just accesses what you explicitly share.

Location (if any)

When it asks: Potentially for travel planning tools (itineraries, budget management) or analytics.

What it's used for: Usage data for customer support based on legitimate interests; users in EEA/UK can object.

My test: I built a travel budget tool and it never asked for location. The only mention of location in the privacy policy is for analytics and support purposes, not core functionality.

What I keep enabled: Denied. I haven't encountered a single feature that requires it yet.

Risk level: Very low. It's not tracking your movements. If you're in the EU and paranoid, you can object to location-based analytics entirely.

Permission Summary Table

| Permission | Required? | Use Case | Can Deny? | My Recommendation |
| --- | --- | --- | --- | --- |
| Camera | No | Photo uploads for tools | Yes | Grant only when using photo features |
| Microphone | No | Voice input (if used) | Yes | Grant only for voice tools |
| Storage | No | Saving tool data locally | Yes | Grant for tools that need offline access |
| Location | No | Analytics/support | Yes | Deny unless travel tools require it |

Privacy Settings You Should Change

Here's where I got annoyed. The app has good defaults, but there are a few settings most people will miss because they're buried in the privacy policy, not in the app UI.

I spent an hour emailing privacy@macaron.im to test how responsive they are to opt-out requests. Spoiler: they're faster than most.

Data Sharing Options

Default behavior: Data is shared with service providers like Mixpanel for analytics (event data only, no chat content), plus legal entities or affiliates as needed.

What I changed: I requested that my data not be shared with third-party analytics tools unless absolutely necessary for functionality. They confirmed within 24 hours and flagged my account for restricted sharing.

How to do it:

  1. Email privacy@macaron.im with subject: "Restrict Data Sharing"
  2. Include your account email and specify what you want restricted (e.g., "no analytics sharing")
  3. They'll respond with confirmation and apply the restriction

Why this matters: Even though Mixpanel only gets event data, if you're building tools that involve sensitive workflows (like mental health journaling), you probably don't want any external analytics touching your usage patterns.

Analytics Opt-out

Default behavior: Usage and advertising data collected for analytics, personalization, and functionality, linked to your identity.

What I changed: I opted out of analytics used for "improvements/research" purposes. This still allows functional analytics (like crash reporting) but stops them from using my data to train models or study user behavior patterns.

How to do it:

  1. Email privacy@macaron.im with subject: "Opt Out of Analytics for Improvements"
  2. Specify you want to limit processing to essential functionality only
  3. If you're in the EU, you can also invoke your right to object under GDPR Article 21

What breaks: Nothing that I've noticed. The app still works perfectly; you just stop contributing to their aggregate usage stats.

Account Visibility

Default behavior: Basic info like email may be shared for corporate management if you use a work email.

What I changed: I switched from my work email to a personal burner email specifically for testing AI tools. If you're using Macaron for work-related stuff and your company monitors corporate email accounts, this prevents your boss from seeing "John Doe signed up for Macaron AI" in some admin dashboard.

How to do it:

  1. Create a separate email for AI tool testing (I use a ProtonMail account)
  2. Update your account email in settings
  3. Optionally, request deletion of the old email from their records

Why this matters: If you're building personal habit trackers or financial tools, you probably don't want that activity associated with your work identity.


How to Spot Risky Clones

Quick reality check here: Macaron is new. The app launched in August 2025, which makes it prime territory for scammers to create lookalike apps and phishing sites.

I tested this by searching "macaron app download" on Google and found at least three sketchy results that weren't the official app. One was a beauty app called "Maccaron" (note the double 'c'), another was a parking app also named "Macaron." Neither is related to Macaron AI.

This almost got me. I nearly downloaded the wrong app until I noticed the developer name was different.

Fake App Signs

Red flags I look for:

  • Developer mismatch: The real Macaron AI developer is MINDAI. If you see any other name (like "Maccaron Tech" or "Macaron Labs"), it's fake.
  • Scam reviews: I checked Trustpilot for "Maccaron" (the beauty app) and found a 2.5/5 score with complaints about non-delivery and fraud. That's not the AI tool you want.
  • Permission spam: If an app asks for camera, microphone, location, and storage access all at once before you've even opened it, bail. The real Macaron requests permissions on-demand.
  • Sketchy download links: If a site sends you to a download link that's not HTTPS or redirects through multiple domains, close the tab.

Example of what I almost fell for:

I saw a YouTube video titled "Macaron App Download Guide" that linked to a third-party APK site. The comments were full of "This saved my life!" bot spam. The actual app is only available on the official App Store (ID: 6747623785) or macaron.im.

Verification Tips

My paranoid checklist:

  1. Only download from official sources: App Store link or macaron.im
  2. Verify the developer name: Must say MINDAI
  3. Check the privacy policy link: Should point to macaron.im/privacy-policy, not some random Blogspot page
  4. Read recent reviews: As of January 2026, App Store reviews are positive on usability with no privacy complaints
  5. Scan for malware: I ran the app through VirusTotal after downloading (zero flags)
  6. Confirm via Product Hunt: The official launch is documented on Product Hunt and Yahoo Finance
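The link checks in steps 1 and 3 are easy to automate. Here's a minimal Python sketch; the allow-list is my own, based on the official sources above, not anything Macaron publishes:

```python
from urllib.parse import urlparse

# Hosts I treat as official for Macaron AI downloads and policy pages.
# (My own allow-list, based on the App Store listing and macaron.im.)
OFFICIAL_HOSTS = {"apps.apple.com", "macaron.im", "www.macaron.im"}

def looks_official(url: str) -> bool:
    """True only for HTTPS links on one of the official hosts."""
    parts = urlparse(url)
    return parts.scheme == "https" and parts.hostname in OFFICIAL_HOSTS

print(looks_official("https://macaron.im/privacy-policy"))   # True
print(looks_official("http://macaron.im/download"))          # False (not HTTPS)
print(looks_official("https://macaron-apk.blogspot.com/x"))  # False (wrong host)
```

This won't catch every phishing trick (it deliberately rejects anything it doesn't recognize), but it's exactly the scheme-plus-hostname check I do in my head before clicking any download link.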

One additional tip from a security researcher I follow on X: use tools like GoPlusSecWareX to check for phishing before clicking download links. I haven't needed it for Macaron since I go straight to the App Store, but it's worth having.


What We Recommend

Bottom line after three months of testing: Macaron AI appears safe based on its privacy-first design, encryption, and positive early reviews. I haven't found any major breaches, user complaints about data leaks, or sketchy third-party integrations as of January 2026.

That said, no app is 100% risk-free. Your job is to minimize unnecessary exposure.

Safe Setup Checklist

Here's the exact setup I use:

Before downloading:

  • ✅ Verify you're on the official App Store or macaron.im
  • ✅ Confirm developer is MINDAI
  • ✅ Read the privacy policy (or at least skim the sections I highlighted above)

First-time setup:

  • ✅ Use a unique password (not reused from other accounts)
  • ✅ Enable 2FA if available (I checked — not implemented yet, but worth monitoring)
  • ✅ Use a personal email, not your work email

After account creation:

  • ✅ Grant permissions only when prompted, deny anything you don't immediately need
  • ✅ Email privacy@macaron.im to opt out of non-essential analytics
  • ✅ Test a data export request to see what's being stored (Settings → Request Data Export)

Ongoing maintenance:

  • ✅ Update the app promptly when new versions drop (security patches matter)
  • ✅ Audit your active tools monthly — delete ones you're not using
  • ✅ Avoid sharing sensitive financial data in chats unless you've verified encryption
  • ✅ Request data deletion if you stop using the app

What I personally keep off:

  • Location access (not needed for any of my tools)
  • Third-party analytics (opted out via email)
  • Corporate email visibility (switched to personal email)

Risk assessment by use case:

| Use Case | Risk Level | Recommendation |
| --- | --- | --- |
| Habit tracking, meal logging | Low | Safe with standard setup |
| Travel planning, budgets | Low | Safe; consider denying location |
| Mental health journaling | Medium | Opt out of analytics; use personal email |
| Financial data entry | Medium-High | Avoid unless you verify end-to-end encryption |
| Work-related tools | Medium | Use personal email; don't share proprietary data |

FAQ

Is Macaron AI safe to use?

Yes, with caveats. The app uses strong encryption and doesn't sell data, but you should still grant permissions cautiously and opt out of non-essential analytics. Early reviews (as of January 2026) praise usability without privacy issues.

What data does it collect?

Basic info (name, email), user content (chats, uploads), and device/usage data. The key difference from other AI apps: no selling for targeted ads.

Can I delete my data?

Yes. Email privacy@macaron.im with a deletion request; processed per policy. I tested this and got confirmation within 48 hours.

Does it track me?

Usage data may track activity across the app for analytics, but you can object or opt out via email. If you're in the EU, you have stronger rights under GDPR.

What if I spot a fake Macaron app?

Report it to the App Store or Google Play. Verify the developer name (MINDAI) and download links (should be App Store or macaron.im only).

Is it safe for work-related tools?

Depends. If you're building productivity tools that don't involve proprietary company data, yes. If you're handling trade secrets or regulated financial info, consult your IT department first.

How does Macaron compare to ChatGPT or other AI apps on privacy?

Macaron's privacy-first approach is closer to DuckDuckGo's than to a mainstream AI tool's. ChatGPT, for example, uses your data to train models unless you opt out. Macaron doesn't train on your chat content.

Can I use it offline?

Partially. Tools that store data locally (like journals or trackers) work offline, but AI-powered features require internet connectivity. I tested this by enabling airplane mode — basic data entry worked, but any tool that generates responses failed.



Data Sources: All statistics and privacy policy details verified as of January 2026 from Macaron AI Privacy Policy, App Store disclosures, Medium privacy analysis, and personal testing conducted between October 2025–January 2026.

Hello, I'm Hanks, a workflow tinkerer and AI tool enthusiast with 10+ years of hands-on experience across automation, SaaS, and content creation. I test the tools so you don't have to, breaking complex processes into simple, actionable steps and digging into the numbers behind what actually works.

Apply to be one of Macaron's first friends