I think it is important not to conflate different notions of information/complexity with their popular interpretations. A basic problem is that the mathematical theories cited in this thread relate to a very exact (and, from our perspective, possibly limited) notion of information. The Kolmogorov complexity of a sequence is the size of the smallest program that can compute that sequence. Since a basic theorem of Kolmogorov complexity theory is that Kolmogorov complexity is uncomputable, this is not a very helpful notion to begin with. A second problem is that Kolmogorov complexity actually measures exactly the opposite of what most people here would call "complexity": if one randomized the sequence of letters in a play by Shakespeare, the result would (with high probability!) have a higher Kolmogorov complexity, since natural text contains many regularities.
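This last point is easy to illustrate with compression, since the compressed size of a string is a crude upper bound on its Kolmogorov complexity. A minimal sketch (the sample text is just a placeholder; any long natural-language passage works):

```python
import random
import zlib

# Placeholder text; any sufficiently long natural-language passage works.
text = ("To be, or not to be, that is the question: "
        "Whether 'tis nobler in the mind to suffer ") * 100

shuffled = list(text)
random.shuffle(shuffled)
shuffled = "".join(shuffled)

# Compressed size is an upper bound on (and a crude proxy for) Kolmogorov complexity.
print(len(zlib.compress(text.encode())))      # small: the regularities are exploited
print(len(zlib.compress(shuffled.encode())))  # larger: shuffling destroys them
```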
This brings us to Shannon information. Shannon information is defined in the study of channels (sources) of symbols forming messages. These sources are characterized by their probability distributions over outputs. The Shannon information (in bits) of a given symbol is minus the base-2 logarithm of that symbol's probability. Any basic application of this to the setting of the universe requires us to imagine a probabilistic source of universes, and the "information" would then relate to the number of bits needed to describe a universe drawn from that source. I do not know what this would really mean. As before, it has the counter-intuitive property that a random stream of symbols carries more information than a text (in a basic application).
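For concreteness, a minimal sketch of the definition (the probabilities are placeholder values, roughly English letter frequencies):

```python
import math

# Hypothetical source: symbol probabilities (placeholder values).
p = {"e": 0.127, "q": 0.001}

for symbol, prob in p.items():
    # Shannon information in bits: -log2 of the symbol's probability.
    print(symbol, round(-math.log2(prob), 2))
# 'e' -> ~2.98 bits; 'q' -> ~9.97 bits. Rarer symbols carry more information.
```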
Alternatively, one can define the notion of entropy. In (Shannon) information theory this is simply the expected Shannon information of the symbols produced by a source; in thermodynamics it is (very basically stated) computed from the number of states the system can be in, and it is non-decreasing in expectation. These two notions are related, and equivalent for many important problems, but it is debatable how well each exhausts the other.
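A minimal sketch of both formulas side by side (the microstate count is a placeholder number; the parallel is Boltzmann's S = k_B ln W versus Shannon's H = -sum p log2 p):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: the expected Shannon information, -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A biased source carries less information per symbol than a uniform one.
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))  # ~1.36 bits
print(shannon_entropy([0.25] * 4))            # 2.0 bits, the maximum for 4 symbols

# Boltzmann's thermodynamic entropy for W equally likely microstates:
# S = k_B * ln(W) -- the same formula up to units (ln vs log2, and k_B).
k_B = 1.380649e-23          # J/K
W = 10**20                  # placeholder microstate count
print(k_B * math.log(W))    # ~6.4e-22 J/K
```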
Turning to cosmology: the problem there is not to explain how the information (in the sense of entropy) arose, but why the universe (apparently!) started out in a state of very low entropy. Cosmic inflation is one attempt to address this important problem in cosmology, but I really think this point is missed in this thread.
A further problem (which should be apparent from the above discussion but needs emphasis): it is to my mind not very sound to put too much stress on Occam-type arguments in which an information/complexity content is ascribed to god and to the universe, and it is then argued that, since the information of the combined system is the sum of the parts' information and hence larger, there is a problem. This is for three reasons:
Firstly, the information of a joint system is not the sum of the information of its parts unless one assumes (information-theoretic) independence of the systems, which complicates the argument a great deal.
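Concretely, joint entropy is subadditive, H(X,Y) <= H(X) + H(Y), with equality exactly when X and Y are independent. A minimal sketch with toy distributions (the numbers are purely illustrative):

```python
import math
from collections import Counter
from itertools import product

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two perfectly correlated binary variables: X = Y with probability 1.
correlated = {(0, 0): 0.5, (1, 1): 0.5}
# Same marginals, but independent: the joint is the product distribution.
independent = {(x, y): 0.25 for x, y in product([0, 1], repeat=2)}

for joint in (correlated, independent):
    px, py = Counter(), Counter()
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    print(entropy(joint.values()), entropy(px.values()) + entropy(py.values()))
# Correlated:  H(X,Y) = 1.0 < H(X) + H(Y) = 2.0
# Independent: H(X,Y) = 2.0 = H(X) + H(Y)
```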
Secondly, it is very hard to find situations in science where this type of inference would consistently select the better ideas, other than post hoc. It is, for instance, the case that a universe populated with actual stars and galaxies (as opposed to these just being spots of light) is as such more complex than a universe consisting of just the solar system. This is not to say there are no serious problems with god as an explanation; I just do not know how to make that argument using information theory.
Thirdly, the Shannon/Kolmogorov notions of information relate (as formulated) to descriptions of systems in some appropriate discretization, and what the appropriate discretization is, is not clear at all. To what extent does the ease with which a system can be described to an outsider allow us to rule on whether that system really exists? The only stringent way I can think of is not as a physical principle, but as a guide for selecting prior probabilities in a Bayesian theory of confirmation. However, what is considered physically true should primarily be guided by evidence, not by priors.
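To see why the discretization matters, here is a minimal sketch (uniform random numbers stand in for "the system"): the very same data yields a different entropy for every choice of grid, so the information content is a property of the description, not of the system alone.

```python
import math
import random
from collections import Counter

def entropy_bits(samples):
    """Empirical Shannon entropy (bits) of a sequence of discrete symbols."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
data = [random.random() for _ in range(10000)]  # the same underlying "system"

for bins in (2, 16, 256):
    discretized = [int(x * bins) for x in data]
    print(bins, round(entropy_bits(discretized), 2))
# Roughly log2(bins): ~1, ~4, ~8 bits. The number grows with the grid,
# so it is not an intrinsic property of the system being described.
```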