How do you feel about unions and unionizing in general?
I'm just curious, because in the United States there appears to be a large-scale "de-unionizing" happening in the working population. Traditionally highly unionized states like Michigan have gone right-to-work, court cases in states like California are arguing for opt-out rights for state employees (specifically teachers), and in Illinois home healthcare workers recently won the right not to be forcibly unionized by the state under the SEIU.
However, as a counterpoint to that activity, organizations like the SEIU, the Center for American Progress, Media Matters, and the UAW have been working to increase union membership in low-wage/low-skill positions (such as fast food), in the banking industry, and at foreign automakers (an effort that recently failed at the VW plant in Tennessee).
Historically, in the mid-to-late 19th century and most of the 20th century in the United States, unionizing was seen as necessary because of poor safety and abusive employers (robber barons and the like). Now, in the late 20th and 21st centuries, most industries are heavily regulated with regard to working conditions, compensation, safety, and compliance, so many of the hallmark benefits of being in a union (safety, reasonable pay, and better working conditions) are no longer at issue; they are mandated by the government.
Most union contracts now seem to be more about protecting long-term employees from being fired, securing better health insurance plans, and ensuring that certain components or production lines are produced in particular facilities.
There is also the issue that, in the mid-20th century, quite a bit of fraud and illegal activity was tied to large unions, including ties to crime syndicates and mob activity.
I personally think that unions are useful, that membership should be optional, and that the fraud claims are on the whole overblown, but that there should be some oversight to ensure fairness and legality for all parties involved.