Step 4 of 4

Take Action

When a threat is detected, you have options. Most threats are handled automatically, but for serious cases you can take additional action.

⚡
Most Actions Are Automatic

Based on your preferences, unauthorized use is blocked automatically on compliant platforms. You only need to act for edge cases.

Action Options

What you can do when something is detected.

Report & Remove

For content on compliant platforms, request immediate removal with one click.

  • 24h average removal time
  • 98% success rate

Mark as Legitimate

Sometimes our AI flags content that's actually okay (like news articles or legitimate references). Mark it as legitimate to improve detection.

Escalate

For serious violations (scams, explicit content, harassment), escalate to our legal partners for formal action.

  • → Cease & desist letters
  • → DMCA takedown requests
  • → Law enforcement referral (for fraud)

What To Do If You're Scammed

If you or a family member falls victim to a voice clone or deepfake scam, follow these steps.

  • 1๏ธโƒฃ
    Report to LMIF

    Log the incident in your dashboard. We'll help document evidence and track the source.

  • 2๏ธโƒฃ
    Contact Your Bank

If money was sent, contact your bank immediately. Many banks offer fraud protection for AI-related scams.

  • 3๏ธโƒฃ
    File a Police Report

    Report to local law enforcement. AI-generated fraud is a crime in most jurisdictions.

  • 4๏ธโƒฃ
    Alert Family Members

    Warn others in your circle. Establish a secret code word for emergencies.

Prevention Tips

Best practices to reduce your risk.

For You

  • ✓ Limit voice recordings on social media
  • ✓ Be cautious with video call requests from unknown numbers
  • ✓ Use privacy settings on social media photos
  • ✓ Establish a family code word for emergencies

For Elderly Family Members

  • ✓ Teach them about voice cloning scams
  • ✓ Establish a code word only family knows
  • ✓ Encourage them to hang up and call back on a number they know
  • ✓ Remind them never to send money based on a phone call alone

Legal Resources

Know your rights as legislation evolves.

TAKE IT DOWN Act

US federal law criminalizing non-consensual intimate imagery, including AI-generated deepfakes.

State Laws

Many states have enacted deepfake and voice cloning protections.

International

Denmark and other countries are leading the way with comprehensive likeness protections.

🎉
You're Protected!

Your identity is claimed, preferences are set, and monitoring is active. You're ahead of most people in protecting your digital identity.