As a worker, you may like the idea of being an at-will employee. Yes, it means you can be fired at any time for any lawful reason, or for no reason at all, but it also means you can walk away from a job whenever you want. You have control over your own career: if you get a better offer or simply decide you no longer want to work for the company, you can move on.
But what if you’re looking for a new job and a prospective employer presents you with an employment contract during the hiring process? Can they require you to sign it as a condition of taking the job, contractually obligating you to work for the company for a set period of time and ending your at-will status?
Employers can require contracts if they choose to do so
First off, no one can force you to sign anything. You always have a choice about which contracts you sign and which terms you accept when taking a job. If all you want is at-will employment, that’s how most employers operate, and you can look for a position that fits your preferences.
That said, the employer does not have to hire you without the contract. A good starting point is to explain why you don’t want to sign and ask whether there’s another way to reach an agreement. But if no compromise can be found, the employer is free to hire someone else.
This is why you have to consider every contract carefully. It can change what you’re legally allowed to do, altering your rights as a worker. Never sign anything on a whim or without understanding how it differs from at-will employment. Think through the legal implications and what is best for your future.