3 Year GDPR Round-Up: Closer Look At Select Key Fines

By Arya Tripathy and Dhruv Suri on July 29, 2021

On May 25, 2021, the European Union General Data Protection Regulation (GDPR) completed 3 years. In these 3 years, multiple orders have been pronounced, including several in which high fines have been imposed. Article 83 of GDPR provides for two kinds of administrative fines depending on the nature of default:

  • breach of obligations relating to child’s consent for information society services (Article 8), processing without identification (Article 11), data protection by design and default (Article 25), other processing-related obligations, breach of codes and certifications, and obligations of the data protection officer (Article 39) – fines up to EUR 10 million or, in the case of an undertaking, up to 2% of the total worldwide annual turnover of the preceding financial year, whichever is higher; and
  • breach of basic processing principles (Article 5), requirements for a lawful basis of processing (Article 6), conditions for consent (Article 7), obligations for processing special categories of data (Article 9), rights of data subjects (Articles 12 to 22), and conditions for cross-border data transfer (Articles 44 to 49) – fines up to EUR 20 million or, in the case of an undertaking, up to 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher (see the illustrative calculation after this list).
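
To illustrate the “whichever is higher” mechanics of the Article 83(5) ceiling, here is a minimal sketch in Python; the turnover figure is purely hypothetical:

    # Hypothetical illustration of the Article 83(5) ceiling: the maximum fine
    # is the higher of EUR 20 million and 4% of worldwide annual turnover.
    def article_83_5_cap(worldwide_turnover_eur: float) -> float:
        """Return the maximum administrative fine under Article 83(5)."""
        return max(20_000_000, 0.04 * worldwide_turnover_eur)

    # For an undertaking with EUR 1.5 billion turnover, 4% is EUR 60 million,
    # which exceeds EUR 20 million, so EUR 60 million is the ceiling.
    print(article_83_5_cap(1_500_000_000))  # 60000000.0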

Since 2018, authorities have imposed fines for a variety of breaches – from retailers misrepresenting the manner in which CCTV cameras were used to monitor employees, to denial of a subject’s right to be forgotten. Amounts collected through administrative fines are then used to fund public services and increase awareness around privacy and data protection.

In this post, we take a closer look at 5 landmark GDPR cases and analyse some important takeaways for data processing practices.

1. Google LLC – EUR 50 million[1]

“Despite the measures implemented by GOOGLE (documentation and configuration tools), the infringements observed deprive the users of essential guarantees regarding processing operations that can reveal important parts of their private life since they are based on a huge amount of data, a wide variety of services and almost unlimited possible combinations” – CNIL Press Release

On June 19, 2020, the Conseil d’Etat, France’s highest administrative court, upheld the fine of EUR 50 million imposed on Google LLC by the French data protection authority, CNIL. The fine was imposed for breach of GDPR on 2 primary grounds – failure to (i) provide notice in an easily accessible form, using clear and plain language, when users configure their Android mobile devices and create Google accounts, and (ii) obtain users’ valid consent to process personal data (PD) for ad personalisation.

Background: The case was initiated with complaints filed by 2 not-for-profit organisations in 2018. CNIL investigated the complaints and observed that Google followed a multi-level approach. At the first level, users were provided with confidentiality rules and terms of use that indicated the main purposes of processing. At the second level, hypertext links allowed users to access more comprehensive information. Once an account was created, other tools were made available to the user to manage privacy settings. CNIL ruled that the information provided was unclear and not easily accessible, as essential information about processing, such as purposes, retention periods and the types of personal data processed for ad personalisation, was scattered across several pages. Further, CNIL observed that some information was described vaguely, denying users their right to understand the extent of processing carried out by Google. Regarding consent, CNIL found that consent was not validly obtained for ad personalisation due to lack of specificity. On this basis, CNIL considered it proportionate to impose a fine of EUR 50 million.

Google appealed and contended that (i) its main establishment was located in Ireland and the Irish data protection authority was the competent authority under GDPR instead of CNIL, (ii) CNIL did not follow the cooperation and consistency procedures as it failed to consult the European Data Protection Board, (iii) CNIL committed errors of law in its findings on breach of the transparency and consent requirements under GDPR, and (iv) the fine was disproportionate because all assessment criteria required under GDPR were not taken into account.

Findings: For ease, we analyse the findings contention by contention.

Contentions (i) & (ii): Article 55 of GDPR states that each supervisory authority is competent to exercise powers in its member state. Article 56 creates special provisions where an organisation has different establishments in different member states. It provides that without prejudice to Article 55, the supervisory authority of the controller’s “main establishment” will act as the supervisory authority. Main establishment means the place of central administration within EU, unless the purposes and means of processing are decided in another establishment of the controller in EU, in which case the latter establishment will be the “main establishment”. Conseil d’Etat observed that main establishment should presuppose “effective and real exercise of management activities” determining the decisions regarding means and purposes of processing. The outcome of which supervisory authority will be the lead authority will be different if the decisions regarding means and purposes are taken outside EU. In such cases, the question for application of Article 56 and determination of lead supervisory authority does not arise, and each member state’s supervisory authority will be competent to look into GDPR violations under Article 55. Conseil d’Etat ruled that Android operating system was exclusively developed and operated by Google LLC in USA, and Google Ireland did not exercise any control over processing or other Google subsidiaries in EU as a central place of administration. Accordingly, Google LLC did not have a main establishment with EU, Article 56 could not be implemented, and CNIL was competent to investigate. It further observed that there was no fallacy in coordination and mutual assistance process under GDPR, as other supervisory authorities including Irish data protection authority did not have a dissenting view on whether Google had its main establishment in EU and in Ireland specifically.

Contention (iii) – transparency: Articles 12, read with Articles 13 and 14, of GDPR set out the transparency and information obligations of the controller. Article 12(1) mandates the controller to provide the information under Article 13 (which specifies what information should be provided at the time of collection) in a concise, transparent, understandable and easily accessible manner, in clear and simple terms. The overall scheme is clear – adequate information must be provided to users so as to enable them to determine the scope and consequences of processing, without causing consent or information fatigue. Conseil d’Etat noted that Google’s multi-level approach was not in conformity with GDPR and observed that:

  • first level was generic when examined in light of the processing operations, the degree of intrusion into privacy, and the volumes of data collected;
  • essential information relating to certain processing operations was only accessible after numerous steps, or through hyperlinks that were difficult to access (six steps to understand what information is processed for ad personalisation, geolocation and retention); and
  • information provided was incomplete or insufficiently precise.

Resultantly, the notice structure was detrimental to accessibility and denied clarity to users.

Contention (iii) – consent: Article 6 provides that processing is lawful only if one of the conditions therein is met, one of which is the data subject’s consent. Article 7 states that if consent is the basis of processing, the controller must be able to demonstrate “valid” consent. Consent should be a manifestation of will: free, specific, informed and unequivocal, obtained from the subject through a clear “positive act”. The consent declaration must distinguish the matters for which consent is obtained; it cannot be implied and cannot be clubbed with other conditions of use. Conseil d’Etat observed that while creating a Google account, the user is first presented with the “Privacy Rules and conditions of use”, which provide brief and very general information; for details, the user has to click on the “more options” link. Otherwise, the user simply checks the boxes “I accept the terms of use of Google” and “I agree that my information will be used as described above and detailed in confidentiality rules”. If the user clicks on “more options”, a page offers them the option to configure their account. For personalisation of ads, a pre-checked box exists which the user has to uncheck. If the user wishes to learn more about personalisation of ads, they have to click on “to know more”, which describes how personalised ads are displayed; even this link does not provide exhaustive information. In essence, the consent architecture is designed on the premise that personal data will be processed as per default settings. Accordingly, Conseil d’Etat ruled that the consent obtained was invalid: there was a lack of sufficient prior information, an omnibus consent was obtained for all purposes, and there was no positive act on the user’s part to signify consent.
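
By way of contrast, the following is a minimal sketch, not drawn from the decision, of how purpose-specific, opt-in consent could be recorded so that a controller can later demonstrate valid consent under Article 7; the purposes and field names are illustrative assumptions:

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    # Illustrative purposes only; a real notice would enumerate its own.
    PURPOSES = ("ad_personalisation", "geolocation", "analytics")

    @dataclass
    class ConsentRecord:
        user_id: str
        # Every purpose defaults to False: no pre-checked boxes, so consent
        # exists only after a positive act by the user.
        choices: dict = field(default_factory=lambda: {p: False for p in PURPOSES})
        timestamps: dict = field(default_factory=dict)

        def grant(self, purpose: str) -> None:
            """Record an affirmative, purpose-specific opt-in with a timestamp."""
            if purpose not in PURPOSES:
                raise ValueError(f"unknown purpose: {purpose}")
            self.choices[purpose] = True
            self.timestamps[purpose] = datetime.now(timezone.utc).isoformat()

    record = ConsentRecord(user_id="u-123")
    record.grant("ad_personalisation")  # only this purpose is now consented to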

Contention (iv): Finally, Conseil d’Etat ruled that, factoring in the gravity of the breach, the period for which it continued, the degree and extent of processing and other factors, the fine was proportionate.

Key takeaways:

  • Provide a granular, succinct, clear and crisp information/privacy notice in an easy and accessible format before collection; multi-level and bundled forms do not meet the mark.
  • Obtain specific consent for specific purposes, requiring the user to perform a positive act to signify consent.
  • Do not club consent with other terms of use.
  • If processing is connected with marketing and promotion purposes, pay closer attention to the consent methods deployed; the judicial tendency is to view such processing as a purpose that is best justified by valid consent. A lawful contract or legitimate business interest may not be considered a proper processing basis.

2.  H&M Foundation – EUR 35 million[2]

“This case documents a serious disregard for employee data protection at the H&M site in Nuremberg. The amount of the fine imposed is therefore adequate and effective to deter companies from violating privacy of their employees.” – Prof. Dr. Johannes Caspar, Hamburg Commissioner for Data Protection & Freedom of Information

On October 01, 2020, the Hamburg Commissioner for Data Protection and Freedom of Information (HmbBfDI) imposed a fine of EUR 35,258,707 on H&M Hennes & Mauritz Online Shop A.B. & Co KG (H&M) for GDPR breaches.

Background: At its Nuremberg service centre, H&M had been extensively recording private details about its employees and storing them on a network drive since 2014. When employees returned from leave, supervisors conducted “Welcome Back Talks”. These talks were recorded, including details of vacations, symptoms of illness and diagnoses, without the knowledge and consent of the concerned employees. Some supervisors also processed information about family issues and political and religious beliefs gathered through these informal talks. The collected information was digitally stored and readable by up to 50 other managers. Further, the information was recorded over long periods of time, enabling meticulous evaluation of individuals as well as the creation of detailed employee profiles used for decisions regarding their employment. These practices came to light when the data became accessible on the company-wide network for several hours in October 2019 due to a configuration error.

Findings: HmbBfDI took suo motu action upon learning of the practices from press reports and immediately required the contents to be frozen. H&M handed over about 60 gigabytes of data for HmbBfDI’s evaluation. After reviewing this data, HmbBfDI observed that such processing was an “intensive encroachment” on employees’ civil rights. While most of the data processed may be harmless, political opinions and religious beliefs qualify as special categories of data under Article 9 of GDPR. H&M tendered an express apology for the breach and provided HmbBfDI with a comprehensive roadmap for future data protection measures, including the appointment of a data protection coordinator, monthly status updates, enhanced awareness about whistleblower protection and the manner of dealing with access requests. H&M also offered to pay compensation to employees. In light of these proposals, HmbBfDI observed that this was a case of “unprecedented acknowledgment of corporate responsibility” following a breach incident.

Key takeaways:

  • While processing special categories of data, identify suitable grounds under Article 9. It prohibits processing unless the basis is one of the 10 grounds listed therein and the associated conditions are fulfilled.
  • When dealing with employee data that qualifies as special PD, such as biometrics for attendance and leave, background verification, medical records, etc., it is imperative to apply data inventorization techniques to identify special PD and a possible basis for processing (see the illustrative sketch after this list).
  • Some of the Article 9 grounds that H&M should have evaluated before processing are (i) explicit consent of the subject for one or more specified purposes, so long as member state law does not prohibit such processing, (ii) processing necessary for carrying out the rights and obligations of the controller or data subject in the fields of employment, social security and social protection law, if authorised under EU or member state law or a collective agreement pursuant to member state law, provided apt safeguards are put in place for the subject’s rights and interests, (iii) processing necessary to protect the vital interests of the subject or another natural person, where the subject is physically or legally incapable of giving consent, (iv) processing relating to PD that is manifestly made public by the subject, (v) processing necessary for legal claims or the judicial activity of courts, and (vi) preventive or occupational medicine, assessing the working capacity of an employee, medical diagnosis, or the provision of health or social care systems and services on the basis of EU or member state law, or pursuant to a contract with a health professional.
  • The grounds relied upon, the purpose sought to be achieved, and the extent of processing need to be proportionate and cannot override the data subject’s rights and freedoms. Careful balancing is required while following the core principles of purpose limitation and data minimisation. In this case, it appears that the threshold for the Article 9 grounds was not met and, consequently, the processing was unlawful.
  • Evaluate the need for data impact assessment if processing involves systematic and extensive evaluation of personal information, including profiling of data subject.
  • Cooperate with the investigation process as it could mitigate the quantum of fine.
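
As referenced in the takeaways above, a data inventory can flag special-category fields that lack a documented Article 9 ground. The sketch below is purely illustrative; the field names, tags and grounds are assumptions, not part of the HmbBfDI decision:

    # Hypothetical data inventory: each field is tagged with whether it is
    # special-category data under Article 9 and the ground relied upon (if any).
    INVENTORY = {
        "employee_name":        {"special_category": False, "basis": "Art. 6(1)(b) employment contract"},
        "sick_leave_diagnosis": {"special_category": True,  "basis": None},  # no ground identified
        "religious_belief":     {"special_category": True,  "basis": None},  # no ground identified
    }

    def fields_missing_article_9_ground(inventory):
        """List special-category fields with no documented Article 9 ground."""
        return [name for name, meta in inventory.items()
                if meta["special_category"] and not meta["basis"]]

    # Processing these fields is unlawful unless an Article 9(2) ground applies
    # and its conditions are met.
    print(fields_missing_article_9_ground(INVENTORY))
    # ['sick_leave_diagnosis', 'religious_belief']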

3. Telecom Italia – EUR 27.8 million[3]

“TIM were proven to be insufficiently familiar with fundamental features of the processing activities they performed (accountability)” – Press Release, Italian Supervisory Authority

The Italian supervisory authority, Garante per la protezione dei dati personali (Garante), fined TIM SpA (TIM) EUR 27,802,496 for several instances of unlawful processing relating to its marketing activities, affecting millions of individuals.

Background: Garante received a series of data subject complaints against TIM about (i) unsolicited marketing calls made without consent, including cases where the subject had refused consent or was registered in the public opt-out register, (ii) unfair processing practices regarding prize competitions conducted by TIM, (iii) lack of response to subjects’ requests to access their PD and to stop TIM from processing their data for promotional purposes, (iv) consent being compelled for activation of certain services on TIM’s website, and (v) use of a non-specific consent form that compelled subjects to consent to all purposes listed therein. Garante also received various breach notifications. Expansive investigations were conducted into these GDPR violations between November 2018 and February 2019.

Findings: Investigations revealed multiple defaults, including the following:

  • targeting of non-customers as “prospects” in the course of promotional campaigns without any consent or other suitable legal basis, with at least one instance where a subject was contacted 155 times in a month;
  • serious discrepancies and delays in updating the “black-lists” of subjects who had exercised their right to object to processing and refused to receive promotional calls, with some refusals being updated after delays of 200 to 300 days;
  • access by customer care operators to past customers’ PD for prolonged periods;
  • retention of past customers’ PD for more than 10 years without any basis;
  • obtaining forced consent for promotional and marketing purposes from customers desirous of participating in TIM offers for certain special advantages (discounts and participation in prize competitions) and of using TIM applications (like My TIM, TIM Personal), when such consent was not essential for the offer or the use of the app;
  • providing subjects with a single “indistinct” consent in a self-certification format that listed various processing purposes such as statistical, promotional and profiling; and
  • frequent data breach incidents, such as incorrect PD being attributed to subjects, inaccurate consent status, etc., that resulted in inaccurate treatment of PD, combined with delayed identification and inadequate management of such incidents.

Taking into account the investigation findings, Garante observed that there was no basis under Article 6 for processing of PD for promotional purposes; TIM should have collected PD with the subjects’ consent. Garante also concluded that there was no specific, documented, unequivocal consent. With respect to other bases of processing under Article 6, it observed that there was no legitimate interest that TIM could rely on to substantiate the processing. Article 6(1)(f) allows controllers to process PD for a legitimate interest, provided it does not override the subject’s interests or fundamental rights that require PD protection. Garante explained that legitimate interest cannot substitute consent or impart legitimacy to past consent problems; a legitimate interest must be determined upfront. Additionally, while relying on legitimate interest, due account must be given to the subject’s reasonable expectation of processing based on an existing relationship, such as employment or an existing customer relationship, which was not the case for all “prospects”. Garante also noted that the right to data protection and “peace of mind” are important subject interests that cannot be overridden by a legitimate interest.

With respect to processing of data for availing TIM offers and using its apps, Garante ruled that TIM’s consent practices were unlawful as there was no free consent. TIM failed to provide the requisite information in a correct and transparent manner, and there was a lack of clarity in the “content” and “wording” regarding the actual data processing carried out by TIM. In fact, even the manner of obtaining combined consent was contrary to the principles of freedom and specificity, thereby constituting a breach of Articles 6 and 7. Furthermore, Garante observed that TIM had failed to exercise control over its processing partners and had failed to put in place adequate and effective measures to evaluate the nature, scope, context and purpose of processing.

Article 5(1)(f) requires the integrity and confidentiality of PD. Article 32 obligates controllers and processors to implement technical and organisational measures to ensure security of personal data commensurate with the potential risk that could arise, considering circumstances such as the state of existing technology, implementation costs, and the nature, scope, context and purposes of processing. These measures could include pseudonymisation and encryption, the ability to ensure ongoing confidentiality and resilience of processing systems, the ability to restore access to personal data in the event of an incident, and a process for regularly evaluating the effectiveness of technical measures. Factoring in the discrepant data management practices, such as delays in complying with subjects’ rights, delayed black-list and consent updation, mismanaged flow of updated PD to processors and frequent data breach incidents, Garante ruled that TIM was in repeated breach of its integrity and confidentiality obligation under Article 5(1)(f) and of its obligation to implement adequate security measures under Article 32.
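
Purely as an illustration of one such measure, the sketch below pseudonymises a direct identifier with a keyed hash from the Python standard library; the environment variable name and sample number are assumptions, and a real deployment would also cover key management, encryption at rest and re-identification controls:

    import hashlib
    import hmac
    import os

    # The pseudonymisation key must be stored separately from the pseudonymised
    # dataset (for example, in a key management service), or tokens can be re-linked.
    PSEUDO_KEY = os.environ.get("PSEUDO_KEY", "change-me").encode()

    def pseudonymise(identifier: str) -> str:
        """Replace a direct identifier with a keyed, non-reversible token."""
        return hmac.new(PSEUDO_KEY, identifier.encode(), hashlib.sha256).hexdigest()

    # Marketing and black-list systems can then operate on tokens rather than
    # on phone numbers or names.
    print(pseudonymise("+39 333 000 0000"))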

Apart from the massive fine, Garante handed down 20 directions setting the limits within which TIM could continue processing data, including numbering of the referenced subjects, timely verification of the consistency of black-lists, implementation of technical and organisational measures for the management of subject requests, review of the procedures used by all applications, deletion of data and strengthening of measures to ensure the quality and accuracy of data.

Key takeaways:

  • Processing of PD for marketing and promotional purposes should be with consent.
  • The legitimate interest basis should weigh in the subject’s reasonable expectation of privacy and data protection, and cannot be invoked retrospectively.
  • Basis of processing must be identified and notified upfront.
  • Consent should be free and unconditional, and processing must have nexus with purposes.
  • Consent needs to be simple, easy and clear.
  • Establish adequate technical, organisational, and managerial measures to honour subjects’ rights requests.
  • Processing PD on the basis of inaccurate or outdated consent records or PD is risky and indicates a lack of appropriate security measures.

4. British Airways plc – EUR 23 Million[4]

“When organisations take poor decisions around people’s personal data, that can have a real impact on people’s lives. The law now gives us the tools to encourage businesses to make better decisions about data, including investing in up-to-date security” – Elizabeth Denham, Information Commissioner

On October 16, 2020, the UK’s Information Commissioner’s Office (ICO) imposed a hefty penalty of about EUR 23 million[5] on British Airways (BA) pertaining to data breach incidents that occurred between June and September 2018.

Background: BA’s database was accessed by a cyber-attacker using compromised credentials for Citrix remote gateway access. Upon gaining access, the attacker edited a JavaScript file on BA’s website and exfiltrated customer details to a third-party domain. BA provided remote access to some of its IT applications to authorised users. The cyber-attack began with the attacker obtaining the login credentials of an employee of Swissport, a third-party cargo services provider to BA based in Trinidad and Tobago. The attacker potentially accessed the PD of 492,612 customers and staff, including names, addresses, payment card numbers and CVV numbers, usernames and passwords, and PINs. The breach was not identified in June 2018 and was discovered only on September 5, 2018, following a third-party alert. BA immediately notified ICO and the investigation commenced.

Findings: Investigations revealed that authorised users could gain remote access with a single username and password, which means there was no multi-factor authentication. The login details were stored in plain text, and as such anyone with domain access could have used them. Building on this loophole, the attacker repeatedly logged into different servers and mined valuable PD, which was also stored in plain text. This included card details, which were meant to be stored only for the limited purpose of testing systems. Consequently, data was redirected to the attacker-controlled website, and each time a customer entered payment card information on BA’s website, a copy was received by the attacker.

BA submitted that the attacker launched tools and scripts that Citrix would ordinarily have blocked and conducted network reconnaissance. Network reconnaissance refers to techniques that help identify network vulnerabilities and provide information about the network itself. Through reconnaissance techniques, the attacker obtained access to a file containing the username and password of a privileged domain administrator account, which was then used to access different servers. In essence, BA contended that it had taken reasonably practicable steps to protect login details. However, BA admitted that the card data should have been stored in encrypted form and that the plain-text storage was a human error.

ICO ruled that, as a data controller, BA failed to process customer PD in a manner that ensured appropriate security, resulting in breach of Articles 5(1)(f) and 32 of GDPR. While assessing the appropriate level of security, account must be taken of the risk of accidental or unlawful destruction, loss, alteration, and unauthorised disclosure of, or access to, PD. The operative word for determining whether Articles 5(1)(f) and 32 are breached is “appropriate”, which is a factual question that takes into account existing standards and guidance. ICO ruled that BA could have implemented several measures to mitigate or prevent the attack risk, such as limiting users’ access, rigorous testing through simulated attacks on business systems, multi-factor authentication, application whitelisting, hardware and server hardening processes, avoiding hard-coded passwords and encrypting scripts. On the quantum of the fine, ICO held that the fine was appropriate after taking into account the mitigation steps taken by BA once the breach was discovered (such as removing the vulnerabilities within 90 minutes and immediately reporting to ICO) and the economic impact of COVID-19 on BA’s operations.
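
To illustrate just one of the measures listed above, credentials can be stored as salted, memory-hard hashes instead of plain text using only the Python standard library; the sketch below is an assumption-laden example, and the parameter choices would need to follow current guidance:

    import hashlib
    import hmac
    import os

    def hash_password(password: str):
        """Return (salt, scrypt digest); only these are stored, never the password."""
        salt = os.urandom(16)
        digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
        return salt, digest

    def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
        """Recompute the digest and compare in constant time."""
        candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
        return hmac.compare_digest(candidate, stored)

    salt, stored = hash_password("correct horse battery staple")
    assert verify_password("correct horse battery staple", salt, stored)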

Key takeaways:

  • Security measures should consider the possibility of human error and how to minimise it.
  • Measures must be commensurate with data assets and the nature of operations.
  • Resilience of security measures is not a one-time exercise but an ongoing one.
  • Systems should be simulated and rigorously tested and adequacy should be evaluated in light of existing standards, state of technology, risk and harm that can ensue, and what mitigation steps are in place.
  • Access control forms a key part of information security and is of heightened importance with remote working culture.
  • Storage of credentials and other sensitive data in plain text should be minimised.

5. Marriott International Inc. – EUR 21.3 Million[6]

“When a business fails to look after customers’ data, the impact is not just a possible fine, what matters most is the public whose data they had a duty to protect” – Elizabeth Denham, Information Commissioner

On October 30, 2020, ICO imposed a EUR 21.3 million[7] fine on Marriott International Inc. (Marriott) for a cyber-attack on Starwood Hotels and Resorts Worldwide Inc. (Starwood) in 2014, which resulted in a breach of the PD of 339 million guests. Marriott acquired Starwood in 2016, and the attack remained undetected till September 2018. Post the acquisition of Starwood, Marriott kept its computer systems separate but made enhancements to the existing IT network security. The unification process was to be spread over 18 months, and the attack was thus limited to Starwood systems. Finally, the penalty imposed pertained to the period after May 25, 2018 (i.e. from when GDPR came into force) and not to periods prior to that.

Background: The attacker installed a web shell on a Starwood network device on July 29, 2014, which was used to support an Accolade software app. This app allowed employees to request changes to content on Starwood’s website. The web shell enabled the attacker to gain remote access, edit contents and install remote access trojans (malware that permits the attacker to perform more actions than a normal user) to obtain unrestricted access to all connected devices. Thereafter, over a period of time, the attacker installed and executed “Mimikatz”, a post-exploitation tool, to harvest login credentials and exfiltrate guest data. Entire files were transported to the attacker-controlled network. The breached PD contained both encrypted and unencrypted data, including guest IDs (that could lead to identifying information), encrypted passport numbers, card expiration dates, etc. After the acquisition, memory-scraping malware was used to extract card details. One such attempt on September 10, 2018 triggered an alert on the Guardium systems (a data security platform offered by IBM) which were applied to the card databases (CDE). Subsequently, Marriott activated its information security and privacy incident response plan and started deploying real-time monitoring and forensic tools on 70,000 legacy Starwood devices. This aided in detecting compromises, and Marriott finally reported the incident to ICO on November 22, 2018. After discovery of the attack, the integration and decommissioning process was expedited and completed on December 11, 2018.

Findings: In its representation to ICO, Marriott submitted that it had carried out only limited due diligence on Starwood’s data processing systems and databases, relying on representations made about the adequacy and resilience of the existing security measures. ICO observed that Marriott breached Articles 5(1)(f) and 32 of GDPR as it failed to process PD in a manner that ensured appropriate security. ICO further observed that Marriott deployed insufficient tools to monitor user accounts and activities, especially privileged accounts. It also noted deficiencies in security alerts on databases, failure to aggregate logs, limited use of Guardium alerts for selected actions, absence of multi-factor authentication on CDE datasets and other such lapses, which resulted in inadequate monitoring of databases and allowed entire databases to be exported. ICO also observed that, taking into account Marriott’s scale and operations, it ought to have used encryption on all passport numbers, software whitelisting on critical devices, and other controls on critical data assets. The breach risk was an identifiable risk and Marriott was obligated to factor it in, all the more so after the Starwood acquisition. Despite that, it delayed decommissioning of the existing Starwood systems.
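
As a rough, assumption-based sketch of the kind of log aggregation and alerting ICO pointed to, the snippet below flags unusually large exports by privileged database accounts; the log format, account names and threshold are invented for illustration:

    from collections import defaultdict

    # Hypothetical export-log entries: (account, rows_exported)
    EXPORT_LOG = [
        ("svc_reporting", 1_200),
        ("admin_dba", 850_000),   # a bulk export worth flagging
        ("admin_dba", 5_000),
    ]

    PRIVILEGED_ACCOUNTS = {"admin_dba"}
    ROW_THRESHOLD = 100_000  # illustrative; tuned per system in practice

    def flag_bulk_exports(log):
        """Aggregate exported rows per account and flag privileged accounts over threshold."""
        totals = defaultdict(int)
        for account, rows in log:
            totals[account] += rows
        return [(account, rows) for account, rows in totals.items()
                if account in PRIVILEGED_ACCOUNTS and rows > ROW_THRESHOLD]

    print(flag_bulk_exports(EXPORT_LOG))  # [('admin_dba', 855000)]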

Key takeaways:

  • The acquirer is liable as controller after an M&A deal closes.
  • Thorough due diligence into information security processes is a relevant consideration for M&A deals.
  • Identified lapses could be potential red flags; evaluate the need to add suitable covenants for pre-closing steps and indemnities.

Conclusion:

GDPR has been a benchmark for data protection laws across the globe, including India’s draft Personal Data Protection Bill, 2019. Precedents from the EU ought to help companies across the globe understand and implement best practices in connection with data processing and security. In fact, once the Indian Data Protection Authority is set up, a lot of its internal training is likely to be based on GDPR precedents. Therefore, it becomes extremely critical for company policies to constantly evolve by keeping track of the latest jurisprudence and understanding why some of the more evolved and sophisticated data protection authorities have discouraged certain practices and fined companies.

The authors acknowledge the initial research work done by Resham Jain.

[1] Decision in CR litigation No. 430810 available at https://www.conseil-etat.fr/ressources/decisions-contentieuses/dernieres-decisions-importantes/conseil-d-etat-19-juin-2020-sanction-infligee-a-google-par-la-cnil (last accessed on July 26, 2021)

[2] Press release available at https://datenschutz-hamburg.de/assets/pdf/2020-10-01-press-release-h+m-fine.pdf (last accessed on July 26, 2021)

[3] Corrective and sanctioning measures against TIM SpA [9256486] available at https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/9256486 (last accessed on July 26, 2021)

[4] ICO Penalty Notice in Case Ref COM0783542 available at https://ico.org.uk/media/action-weve-taken/mpns/2618421/ba-penalty-20201016.pdf (last accessed on July 26, 2021); prior to Brexit, ICO investigated on behalf of other EU supervisory authorities under GDPR, and the penalty was approved by them through GDPR’s cooperation process.

[5] The penalty amount is GBP 20 million, which is about EUR 23,398,768 (EUR 1 = about GBP 0.85)

[6] ICO Penalty Notice in Case Ref COM0804337 available at  https://ico.org.uk/media/action-weve-taken/mpns/2618524/marriott-international-inc-mpn-20201030.pdf (last accessed on July 26, 2021)

[7] The penalty amount is GBP 18.4 million, which is about EUR 21,517,052 (EUR 1 = about GBP 0.85)
