- Category: Answers to Your Questions
- Published: Saturday, 01 March 2014 13:27
- Written by Doug Hartley
- Hits: 1497
No, dealerships do not have to provide you with any insurance coverage. They should carry insurance on their cars while the vehicles are in their possession, including while you are test driving one. But once you sign on the dotted line and they hand you the keys, it's your responsibility to get and keep auto insurance on your car.

That said, a dealership might help you secure coverage. Perhaps they connect you with an agent or company they work with, or even roll the cost of the insurance down payment into the loan. But that's entirely up to you and them if you want to negotiate such a deal. Remember, they want to close the sale: getting you insured so you can leave with your car is in their best interest, but where they get that insurance may not necessarily be in yours.

If you already have insurance, it might be as simple as adding the new car to your policy. If you don't, take some time to shop around, and make sure your car-buying budget includes the down payment on insurance.