Let's say you make widgets, which sell for $1.
Let's say you're paying people $7.50/hour, and the minimum wage gets hiked to $15.
Before the hike, they could afford 7.5 widgets for every hour they worked.
Let's say your wage costs are 70% of your total price point, which is enormously high (restaurants, which trend high among industries, typically operate in the 20-30% range).
This means out of every widget, $0.70 went to pay salaries.
Let's also say every single employee is at the same wage. That won't be true in practice, but it stacks the deck against my argument, because the hike then affects more of your price point.
So, doubling your wage costs means each widget costs you $1.40 in wages instead of $0.70. The remaining $0.30 of other costs and profit stays the same, so your total price is now $1.70 per widget; profits are unchanged, and you've passed the entire cost along to customers.
Now, let's look at how many widgets your employees could buy per hour of work. Remember, before, it was 7.5 widgets per hour. Now, at $15/hour in wages, and $1.70/widget, they can afford about 8.8 widgets per hour of work.
Their purchasing power has increased.
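If you want to sanity-check the arithmetic, here's a quick Python sketch using the exact numbers above (the variable names are mine):

```python
# Worked numbers from the widget example.
old_wage = 7.50      # $/hour before the hike
new_wage = 15.00     # $/hour after the hike
old_price = 1.00     # $/widget
wage_share = 0.70    # wages as a fraction of the price point

wage_cost = wage_share * old_price    # $0.70 of each widget goes to wages
other = old_price - wage_cost         # $0.30 of other costs + profit, unchanged
new_price = 2 * wage_cost + other     # $1.70/widget after passing the cost along

old_power = old_wage / old_price      # 7.5 widgets per hour worked
new_power = new_wage / new_price      # ~8.82 widgets per hour worked
print(old_power, round(new_power, 2))
```

Run it and the purchasing-power numbers come out exactly as claimed: 7.5 widgets/hour before, about 8.82 after.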
The only way this could work out differently is if wages were more than 100% of your price point, meaning you weren't just making zero profit, you were losing money on every widget. Otherwise, the wage share is below 100%, so passing the doubled wages along raises the price by less than 100% while the wage itself rises by a full 100%.
Hence, mathematical impossibility. This is basic shit. I assume you don't work in accounting at your firm.
- - - Updated - - -
Then your source is wrong.
Here's info about running a restaurant:
https://upserve.com/restaurant-insid...es-restaurant/
Labor costs should be 20-30% of gross revenues. Your source, by your own claim, sets the lower bound at 40%. Either you're reading it wrong, or your source is wrong.