Especially in the last few weeks, American news has been flooded with Dominican Republic stories, ranging from poisonings to shootings.
(Edit) I know it's always been bad; it's just getting a lot more attention now.
I'm just legitimately confused about why now. I've never seen or heard of anything bad happening to tourists in the DR in recent years, especially American ones, even though a good portion of tourists in the DR are American.