Interviewer: Interesting. Cool, but do you think we can do it on a digital substrate? Do you think we can get something that is as capable and as intelligent, and has cognitive capacities as good as humans? Currently, humans run on wetware; we run on neurons, which fire yes or no. But can we do that same sort of thing on chips?
Interviewee: I don't think so, no. Because what we model, we don't model explicitly the same way it happens in our brain, right? It's all still approximation, and we still don't know for sure how it all happens in our brain, so I think we're still missing many details.
(from: Interviews with AI Researchers)
Consider that current AI systems can do things that would have seemed magical just a few years ago. For example:

- GPT-3 (2020) can write convincing articles, generate code from natural-language descriptions, and perform well on a wide range of text-based tasks.
- PaLM (2022) improved upon GPT-3 and other cutting-edge systems mostly by increasing the number of parameters and training on better hardware, without revolutionary new insights into the nature of human intelligence.
  - PaLM improves upon the state of the art in 28 of 29 tested NLP tasks.
  - PaLM can, among other things, explain an original joke when given a two-shot prompt (a minimal sketch of such a prompt follows this list).
  - In code generation, PaLM matches the performance of Codex 12B (which was fine-tuned for code) while being trained on 50 times less Python code.
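To make the "two-shot prompt" claim concrete, here is a minimal sketch of how such a prompt could be assembled. Everything in it is illustrative: the jokes are placeholder examples rather than the ones used in the PaLM paper, and `query_model` is a hypothetical stand-in for whatever API actually serves the model.

```python
# Minimal sketch of a two-shot prompt for joke explanation.
# The jokes below are invented placeholders, and `query_model` is hypothetical:
# it stands in for a real language-model API call.

def build_two_shot_prompt(new_joke: str) -> str:
    """Assemble a prompt with two worked examples followed by the new input."""
    examples = [
        ("I told my computer I needed a break, and it froze.",
         "The joke plays on 'break': the speaker wants rest, but the computer "
         "interprets it as an interruption and stops responding ('froze')."),
        ("Why don't scientists trust atoms? Because they make up everything.",
         "The pun is on 'make up': atoms compose all matter, but 'make up' also "
         "means to lie, so the atoms are jokingly accused of dishonesty."),
    ]
    parts = []
    for joke, explanation in examples:
        parts.append(f"Joke: {joke}\nExplanation: {explanation}\n")
    # The model is expected to continue from the final "Explanation:" cue.
    parts.append(f"Joke: {new_joke}\nExplanation:")
    return "\n".join(parts)


# Hypothetical usage (query_model would wrap an actual LLM API):
# prompt = build_two_shot_prompt("I'd tell you a UDP joke, but you might not get it.")
# print(query_model(prompt))
```

The point of the sketch is simply that "two-shot" means the model sees two worked examples of the task in its context before being asked to do it on a new input; no fine-tuning is involved.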
There's no fundamental reason why the computations carried out by biological brains can't also be carried out by silicon chips. Many things that people once speculated only biology could do, such as creating art, understanding humor, and producing eloquent language, have already been achieved by modern AI.