This is a somewhat obscure Ruby on Rails question, but I thought I’d ask it anyway. Is it the case that the code that reads test fixtures into your database ignores any settings you apply to your models? It sure looks to me like if you point a model at a table name other than the default using
set_table_name, or override the default connection information to associate a particular model with a different database, the part of the testing framework that reads the fixtures ignores it completely. That seems like a major limitation of fixtures, which are otherwise very useful.
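For concreteness, a minimal sketch of the two overrides I mean. `set_table_name` and `establish_connection` are the standard ActiveRecord calls for this; the model, table, and connection names here are made up for illustration:

```ruby
# app/models/event.rb
class Event < ActiveRecord::Base
  # Point this model at a table other than the default ("events")
  set_table_name "legacy_events"

  # Associate this model with a different database, using a
  # connection named in config/database.yml (hypothetical name)
  establish_connection :legacy_database
end
```

With either override in place, the fixture loader still appears to derive the table name from the fixture file name and to use the default test connection.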
For those of you who don’t use Ruby on Rails, fixtures provide a way to easily seed your unit tests with data, as well as to wipe out data between tests. The way Rails testing usually works is that you create a database that’s just for testing and set up some fixtures with test data, so that you can quickly and easily run your unit tests against a clean database, both to ensure that data errors aren’t causing your code to break and to isolate your test data from data that might actually be useful.
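As a minimal sketch of what that workflow looks like (file names, model, and fixture contents are hypothetical), a YAML fixture file seeds rows keyed by name, and the test class declares which fixtures it needs:

```ruby
# test/fixtures/products.yml -- seed data, one entry per row:
#   widget:
#     id: 1
#     name: Widget
#     price: 10

# test/unit/product_test.rb
require File.dirname(__FILE__) + '/../test_helper'

class ProductTest < Test::Unit::TestCase
  # Loads products.yml into the products table of the test
  # database before each test, wiping whatever was there
  fixtures :products

  def test_fixture_data_is_loaded
    # products(:widget) returns the row defined in the fixture file
    assert_equal "Widget", products(:widget).name
  end
end
```

The `fixtures` declaration is where my question bites: the loader infers the table from the fixture file name rather than asking the model, which is why model-level overrides seem to be ignored.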