Now to give this one a less glib response than before. What you really mean, I think, is that a program is limited to the specifications that a programmer has imposed upon it. The machine cannot do anything the programmer hasn't thought of (defects in the program notwithstanding). As a programmer, I know darn well that this is true.
Any machine you make can only be as intelligent as its creator or creators.
But the human brain is no different. The human brain cannot do anything that it isn't programmed to do. For example, the brain's vision system is programmed to work a certain way. You can't just change how you see things. You can't choose to see a leprechaun if one isn't actually there. You might ingest a mushroom and suddenly you do see a leprechaun, but all you're doing is interfering with your brain's programming.
And yes, your brain can learn to do new things, but so can a software program. Why couldn't a sufficiently smart program just write a subprogram to handle an unfamiliar task? That's not unlike what the human mind does, though we do it unconsciously. Watch an EEG (electroencephalogram, a type of brain scan) of somebody learning to do something and you can see changes in the brain as the subject learns to perform the task better. What the brain is doing is optimizing itself for the task, organizing things so that the information can be processed faster and better. It is reprogramming itself.
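To make the "write a subprogram" idea concrete, here's a toy sketch in Python (all names are invented for the example; this is an illustration of self-extension, not a real AI): a dispatcher that, when asked to perform a task it has no handler for, composes a new handler out of known primitive operations and caches it, a crude analogue of the brain reprogramming itself for a task.

```python
class SelfExtendingProgram:
    """Toy illustration: a program that writes and caches subprograms on demand."""

    def __init__(self):
        # Start with a few built-in primitive operations.
        self.primitives = {
            "double": lambda x: x * 2,
            "increment": lambda x: x + 1,
        }
        self.handlers = {}  # learned subprograms, cached by task name

    def learn(self, task_name, steps):
        """Compose known primitives into a new subprogram for an unfamiliar task."""
        ops = [self.primitives[s] for s in steps]

        def handler(x):
            for op in ops:
                x = op(x)
            return x

        self.handlers[task_name] = handler  # cache it: later calls reuse this path

    def perform(self, task_name, x, steps=None):
        if task_name not in self.handlers:
            if steps is None:
                raise ValueError(f"unfamiliar task {task_name!r}; no steps given")
            self.learn(task_name, steps)  # write the subprogram on demand
        return self.handlers[task_name](x)


prog = SelfExtendingProgram()
# First encounter: the program must build a handler from primitives.
print(prog.perform("double_then_inc", 5, steps=["double", "increment"]))  # 11
# Later encounters: the learned subprogram is reused directly.
print(prog.perform("double_then_inc", 10))  # 21
```

Of course, a real learning system generalizes rather than composing hand-named steps, but the structural point stands: nothing stops a program from creating and retaining new procedures it wasn't explicitly given.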
On what grounds?
This is the illusion caused by the true Wizard of Oz hiding behind the curtain in the back of the room. We can make it seem real, seem possible, but there is one part science can't seem to emulate. Machines cannot be conscious because we cannot replicate consciousness.
Because humans evolved to have it. Humans are at the top of the food chain. We have our pick of whatever species we want to eat, and no other species can eat us so long as we are properly prepared for the possibility (for instance, carrying a powerful rifle if you think you might soon encounter a tiger). This is obviously a strong evolutionary advantage; humanity's not going anywhere anytime soon, barring something like a surprise meteor, running out of natural resources, or World War III.

But look at us. We're not huge. We're not strong. We don't run fast. We don't have tough natural armor. We have nothing going for us that some other species doesn't do better... except our brains. They didn't evolve all at once, of course. We started with simple things such as tool use (which has been observed in animals, especially birds and apes) -- aha, we can kill prey more effectively if we throw a spear at it; we can improve the quality of meat by cooking it -- and so our dominance grew a bit. Then we observed that we wouldn't have to spend so much time searching and running if we simply bred our prey instead of hunting for it, and so we invented farming. We applied much the same idea to plants. Now societies started to form and people had to learn how to get along with one another. Every smart idea that humanity came up with increased its dominance, and so, in accordance with the theory of natural selection, smarter populations flourished while less smart populations faltered.
"But hypothetically," you say. Well, if you can "hypothetically" give humans the ability to give machines consciousness, then "hypothetically" there must have been some being who gave humans consciousness. The Machine didn't just wake up with it. Why would the human?
Consciousness is just a side effect of being smart. I don't see how you could be smart -- able to learn new things, able to explore new ideas of your own volition -- without being sentient. What would it even mean?
I didn't say it did. My entire point was that science cannot prove it wrong (or right). The existence or nonexistence of a divine being is simply not a scientific matter at the moment.
Ignoring the matter does not prove it wrong.