By K Raveendran
The Supreme Court’s revocation of the 2002 provision allowing fresh law graduates to apply for civil judge posts marks the end of a deeply flawed and impractical policy that persisted for over two decades. While the phrase “better late than never” may feel appropriate on the surface, it hardly captures the depth of the issue or the long-standing consequences that arose from what was, in hindsight, an ill-conceived judicial experiment. For 23 years, the Indian judiciary functioned under a decision that ignored the critical value of practical legal experience, permitting individuals with no courtroom exposure, and often no real-world legal understanding, to ascend to positions of great judicial responsibility. It is not merely a matter of policy correction; it is a belated recognition of a fundamental truth: that there is no adequate substitute for experience in the practice of law, especially in the judiciary.
The 2002 decision, which opened the gates for fresh law graduates to become civil judges, was defended at the time as a move to democratize and accelerate the entry of new talent into the judicial system. Yet this justification crumbles when placed under the weight of the consequences it triggered. Law, unlike many other professions, is intrinsically linked to the lived realities of people. A lawyer in the courtroom does not merely argue statutes and precedents; they navigate complex human situations, societal dynamics, and the intricacies of legal institutions. Expecting a fresh graduate, who has never even appeared in court, to interpret the law, weigh evidence, and deliver just verdicts was an act of undue optimism at best and judicial irresponsibility at worst.
What perhaps is most damning about this decades-long allowance is the silence that surrounded it for so long. Even though the system bore the brunt of inexperienced judges struggling to cope with their responsibilities, it took 23 years for the highest court in the land to admit what was evident to practitioners, litigants, and academics alike—that a foundational period of legal practice is indispensable before one can fairly and competently occupy a judicial role. It is hard to calculate the true extent of damage this policy may have done, but anecdotes and inside accounts paint a troubling picture. Stories have surfaced of young judges writing their orders in pencil, presumably so that mentors or well-wishers from outside the court could later review and correct them. These are not apocryphal tales intended to malign individuals; rather, they expose the systemic inadequacies that such a policy nurtured.
The judiciary, unlike the legislative or executive branches, relies almost entirely on public trust and the perceived competence of its members. Every time an underprepared judge delivers a poorly reasoned verdict or fumbles in the conduct of a trial, that trust erodes. When litigants and lawyers notice that a judge is still learning the basics of procedure, it undermines confidence not just in that individual, but in the institution as a whole. For a system that already grapples with backlog, inefficiency, and resource shortages, inserting untrained adjudicators into its midst was not just misguided—it was detrimental.
The tragedy of the 2002 decision also lies in its disregard for the evolution of professional competency. In nearly every field that deals with people’s lives and futures—medicine, education, engineering—practical experience is a non-negotiable precondition for responsibility. We would not dream of appointing a fresh medical graduate as a head surgeon, or of allowing someone with a teaching degree but no classroom experience to lead a school. Why, then, was the judiciary, a profession that demands intellectual rigour, psychological maturity, and real-time decision-making, treated differently? The answer may lie in an over-romanticised view of merit and academic brilliance. But legal excellence does not develop in the vacuum of classrooms or the neat world of textbooks; it is forged in the messiness of real cases, tough negotiations, and ethical dilemmas.
Moreover, the decision, though intended to fast-track careers, may in fact have stunted the professional growth of many young appointees. Without adequate grounding in actual legal practice, many of these judges likely lacked the intuitive understanding of legal processes that only experience confers. They may have had to rely excessively on court staff or senior clerks, further entrenching existing hierarchies rather than democratizing the judiciary. Their early entry into the judiciary may have seemed like a leap forward, but in reality it may have been a premature push into waters they were not yet prepared to navigate. The results were predictable: hesitancy, poor decision-making, and an excessive dependence on informal support systems.
It is worth noting, too, that the responsibility for this lapse does not rest solely with those who designed or defended the 2002 policy. The judicial community as a whole bears some responsibility for not confronting the obvious shortcomings of this approach earlier. There was no shortage of data, research, or testimony that could have compelled a re-examination of the decision much earlier. Yet institutional inertia and an unwillingness to admit past mistakes allowed the policy to persist long after it had outlived any presumed utility.
One must also consider the silent victims of this flawed policy—the litigants whose lives and futures depended on fair and competent adjudication. Justice delayed is justice denied, but justice delivered incompetently is arguably worse. How many individuals suffered miscarriages of justice at the hands of inexperienced judges who were still learning on the job? How many appeals had to be filed, how many judgments overturned, and how many careers damaged before the systemic impact of the policy became undeniable? These are not mere statistical questions; they are deeply human ones, with implications that reach into every corner of society.
The Supreme Court’s decision to finally revoke the provision is, therefore, not a moment of celebration but of sober reflection. It reminds us that even the most august institutions can err and that the costs of such errors can be both deep and enduring. In acknowledging the mistake, the Court has taken a crucial step—but the legal fraternity must now commit to ensuring that lessons are learned and embedded in future policymaking. Reforms in judicial appointments must be guided not by expedience or ideological aspiration, but by a clear-eyed understanding of what makes a good judge. (IPA Service)