Optimal foraging theory predicts that predators are selective when prey is abundant but become less selective as prey grows sparse. Insectivorous bats in temperate regions face the challenge of building up the fat reserves vital for hibernation during a period of declining arthropod abundance. According to optimal foraging theory, prehibernating bats should therefore adopt a less selective feeding behavior, yet empirical studies have revealed many apparently generalized species to be composed of specialist individuals. Targeting the diet of the bat Myotis daubentonii, we used a combination of molecular techniques to test for seasonal changes in prey selectivity and for individual-level variation in prey preferences. DNA metabarcoding was used to characterize both the prey contents of bat droppings and the insect community available as prey. To test for dietary differences among M. daubentonii individuals, we used ten microsatellite loci to assign droppings to individual bats. The comparison between consumed and available prey revealed a preference for certain prey items regardless of their availability: nonbiting midges (Chironomidae) remained the most highly consumed prey at all times, despite a significant increase in the availability of black flies (Simuliidae) towards the end of the season. The bats sampled showed no evidence of individual specialization in dietary preferences. Overall, our results offer little support for optimal foraging theory. More broadly, our approach shows how novel combinations of genetic markers can be used to test general theory, targeting patterns at the level of both prey communities and individual predators.