The Walking Dead is a great series, but there is a lot of drama that takes attention away from the main characters: THE ZOMBIES. I know it "humanizes" the story, but I think it could have been better. The comics are so awesome.