By Guest Contributor

The future is here, and so is Artificial Intelligence (AI). Scammers are constantly looking for new ways to defraud you and to refine their techniques. The Federal Trade Commission recently warned that con artists can use AI to clone the voice of a loved one. The technology behind this is the deepfake: media that has been digitally fabricated or altered. All a scammer needs is a short audio clip, which is easy to come by with so much content posted online these days.
 
Tips to protect yourself:
 
1. Don’t trust the voice alone
2. Hang up and call the person directly to verify
3. Use a phone number you know is theirs
4. Be wary of requests for payment by gift card, money transfer, or cryptocurrency; scammers prefer these methods because they are much harder to trace
5. Don’t let pressure push you into an emotional, hasty response
 
Source: ftc.gov and AgeWell Middle Tennessee