Andrew Ng: Why You Should Still Learn to Code

Summary
– Andrew Ng recommends everyone learn basic AI-assisted coding as a fundamental skill for communicating with computers, even without becoming professional developers.
– AI is transforming software development by making coding faster but creating new bottlenecks in product management, requiring engineers to develop broader skill sets.
– Computer science education is failing graduates by not adapting curricula to include AI collaboration skills, creating a mismatch between job market needs and graduate qualifications.
– Ng believes public AI fears have been exaggerated by corporate PR campaigns and advocates for developer-led public education to improve AI literacy and perception.
– Rather than restrictive regulations, Ng prefers transparency requirements for large AI companies and safe sandboxed environments to balance innovation with responsible AI development.

Navigating the evolving relationship between artificial intelligence and software development requires understanding both the technology’s capabilities and its limitations. At the recent AI Dev conference in New York, hosted by DeepLearning.ai, Andrew Ng shared his perspective on why coding skills remain vital, even as AI tools transform how we write software. While AI has made programming more accessible, Ng believes everyone, not just engineers, should grasp the fundamentals of instructing computers, equating it to basic math literacy in the digital age.
Ng observed that AI has significantly lowered the barrier to creating software, enabling what he terms “AI coding” or “vibecoding.” He hopes this accessibility encourages more people to learn programming basics. “One of the most important skills of the future is the ability to tell a computer exactly what you want it to do for you,” he explained. He emphasized that understanding how to communicate with machines matters more than memorizing complex syntax. Although he welcomes non-developers into the coding community, Ng admitted that relying heavily on AI assistants can be mentally taxing, even for experienced professionals.
The acceleration of software development through AI has shifted bottlenecks from prototyping to product management. Ng suggested that engineers who acquire product management skills can operate more independently, effectively becoming “a team of one.” This idea of professionals becoming generalists resonated throughout the conference. Fabian Hedin, CTO of Lovable, noted that individuals with deep expertise in non-software fields can now iterate much faster by applying coding skills. Laurence Moroney of Arm added that this breaks down silos, allowing niche experts to contribute more broadly. According to Ng, the new challenge for developers lies in conceptualizing what to build, since AI handles much of the implementation. Hedin agreed, pointing out that AI still struggles to capture human intuition, a strength that remains distinctly human.
The job market for computer science graduates reflects these shifts. Ng expressed concern that many universities have been slow to update their curricula since 2022, leaving graduates unprepared for today’s AI-driven coding roles. He described it as “malpractice” for institutions to award CS degrees without teaching students how to work effectively with AI assistants. Overhiring during the pandemic and subsequent cutbacks have made entry-level coding jobs scarce, but Ng stressed that graduates with AI experience are in high demand. “For the fresh college grads that do know those skills, we can’t find enough of them,” he noted.
Public apprehension about AI was another key topic. Ng acknowledged that AI “has not yet won America’s hearts and minds,” attributing much of the fear to PR campaigns by certain businesses aimed at lobbying. Miriam Vogel of Equal AI urged developers to engage with public concerns and improve AI literacy, warning that failure to address these fears could hinder progress. Ng encouraged candid conversations to help the public form rational views about the technology.
Regarding artificial general intelligence (AGI), Ng remains skeptical of alarmist predictions. He argued that current AI models, despite their advances, are far from achieving human-level intelligence, relying heavily on engineered inputs and vast datasets. On safety and governance, he prefers sandboxed environments that ensure safety without stifling innovation, rather than restrictive regulations. Vogel defined governance as translating principles into actionable workflows, expressing particular concern about smaller AI firms that lack established governance frameworks.
Ng critiqued heavy-handed regulatory approaches, such as those in the EU, arguing that leadership in AI isn’t achieved through excessive legislation. He praised the Trump administration’s AI Action Plan for maintaining flexible federal rules. While some experts worry about the lack of U.S. AI regulation, Ng sees few well-designed proposals aside from bans on nonconsensual deepfakes and FTC actions against deceptive AI practices. He advocates for transparency requirements targeting large AI companies, which would help identify issues early without overburdening startups. Reflecting on social media’s problems, Ng emphasized that mandated transparency could provide clearer signals of emerging risks, reducing reliance on whistleblowers.
In summary, Andrew Ng’s insights highlight a future where coding literacy becomes universal, professionals embrace broader skill sets, and thoughtful governance balances innovation with responsibility.
(Source: ZDNET)
