End user

An end user is the individual or entity that ultimately utilizes a product, good, or service after its development and distribution, distinct from intermediaries involved in production or resale. In the context of computing and information technology, end users are the people who directly interact with software applications, hardware devices, or digital services to perform tasks, without participating in their design or programming. The term emphasizes the final point of consumption, highlighting the importance of user needs in product design to ensure usability and effectiveness. End users play a critical role in fields like end-user computing and end-user development, where systems are tailored to empower non-experts in creating or customizing applications, thereby increasing productivity and reducing reliance on IT specialists. Unlike customers who may purchase products for others, end users provide direct feedback through usage patterns, informing iterative improvements based on real-world application rather than theoretical assumptions. This focus on empirical user behavior underscores causal mechanisms in technology adoption, where intuitive interfaces drive sustained engagement over complex alternatives.

Definition and Historical Development

Core Definition and Distinctions

An end user refers to the individual or entity that directly consumes or operates a completed product, such as a software application, device, or service, for its primary functional purpose without engaging in its creation, customization at the code level, or infrastructural oversight. This role emphasizes practical interaction with the final, user-facing product to achieve specific tasks, distinguishing it from technical professions by prioritizing empirical utility over expertise in underlying architectures. In contrast to software developers, who architect and implement the core logic and features of systems, or system administrators, who manage deployment, configuration, and maintenance of the supporting infrastructure, end users lack the access or tools to modify foundational elements and instead rely on pre-packaged outputs tailored for direct use. For example, a customer utilizing a banking app to transfer funds exemplifies an end user, whereas the developer coding the transaction algorithms or the administrator configuring server clusters for uptime represent distinct roles focused on creation and upkeep rather than terminal consumption. Intermediate participants, such as testers who provide feedback during refinement, bridge these roles but do not constitute end users, as their involvement aids development rather than final utilization. End users generate causal demand for products through their adoption and usage patterns, which empirically guide iterative improvements by revealing real-world needs and pain points, yet this influence is indirect and constrained by their exclusion from systemic controls, fostering dependency on designers for reliability, updates, and adaptation to evolving requirements.

Origins and Evolution in Computing History

In the 1960s and 1970s, computing centered on large mainframe systems such as the IBM System/360, announced on April 7, 1964, which supported batch processing in which users submitted jobs via punch cards or tapes for execution by centralized operators, rendering end users largely passive recipients of processed outputs rather than direct interactors. This era's architecture prioritized efficiency for organizations over individual agency, with users accessing results through intermediaries, as computing resources were expensive and scarce, typically confined to data centers. The personal computing revolution began in the late 1970s and accelerated in the 1980s, marking the emergence of the end user as an active participant. Software like VisiCalc, released on October 17, 1979, for the Apple II, introduced the electronic spreadsheet, which enabled non-programmers to manipulate data interactively without coding; it rapidly sold over 100,000 copies, demonstrating demand for user-friendly tools. The IBM Personal Computer's launch on August 12, 1981, followed by the Apple Macintosh on January 24, 1984, democratized access by providing affordable, standalone machines with graphical interfaces, allowing individuals to own and operate systems independently of specialists. Lotus 1-2-3, released January 26, 1983, built on this by integrating spreadsheet, database, and graphics functions into a single package for PCs, further empowering business users to perform complex analyses autonomously. By the 1990s, end-user programming solidified through features like macros in Microsoft Excel, first released September 30, 1985, for the Macintosh and later Windows, permitting customization via recorded scripts that extended beyond predefined functions. Personal computer household penetration in the United States rose from approximately 15% in 1989 to over 50% by 2000, reflecting widespread adoption that entrenched the end user role. Into the 2000s, web applications and mobile devices, exemplified by the iPhone's release on June 29, 2007, shifted emphasis to intuitive interfaces and touch-based interaction, further decentralizing computing from institutional control to personal devices. This progression from mediated mainframe access to direct, programmable personal tools defined the end user's evolution as the primary agent in computing ecosystems.

Role and Responsibilities in Systems

Differentiation from Developers and Administrators

End users in computing ecosystems are distinguished from developers and administrators by their limited technical oversight and task-oriented engagement, which contrasts with the former's emphasis on software creation and the latter's focus on infrastructure management. Developers concentrate on writing code, designing architectures, and implementing features to build applications, often iterating through testing and debugging to achieve functional goals. System administrators, meanwhile, prioritize deploying software, allocating resources, and maintaining stability across servers, networks, and storage to ensure reliable operation. For example, while a sysadmin might configure load balancers to absorb traffic spikes, an end user simply runs endpoint applications like word processors or browsers without configuring underlying systems. This role separation arises from end users' prioritization of productivity over technical depth, frequently resulting in improvised interactions that bypass optimization protocols. Office employees, for instance, may utilize email software for routine correspondence without grasping transport protocols, leading to patterns like habitual password resets or unintended overwrites that strain support resources. Such behaviors create a causal pathway to instability, as aggregated user-level missteps amplify demands on shared infrastructure without the mitigating expertise developers or administrators apply. Structural analysis reveals why end users warrant distinct handling: their actions generate disproportionate support burdens, with surveys showing that for 42% of organizations, employee-initiated trouble tickets comprise more than half of reported issue volume, directly linking user interactions to elevated operational loads. Developers and administrators, equipped with domain-specific tools and foresight, preempt issues through code reviews or proactive monitoring, whereas end-user errors necessitate layered safeguards like simplified interfaces and automated recovery to preserve overall stability. This delineation supports targeted interventions, such as user-centric design, to decouple task execution from systemic risks.

Expectations and Behaviors in Practice

End users in computing systems are generally expected by organizations to adhere to security policies, such as promptly applying software updates, employing unique passwords across accounts, and exercising caution against unsolicited communications to mitigate risks like malware infection. However, empirical data reveals frequent deviations, with resistance to updates stemming from concerns over disruptions to workflows; for instance, one 2023 survey found that 67% of end users delay or avoid updates due to perceived interference with daily tasks. This behavior aligns with users prioritizing short-term convenience over long-term stability. Password reuse remains prevalent among end users, despite organizational mandates for strong, unique credentials, as evidenced by a 2022 Google study indicating that 52% of users recycle passwords across personal and work accounts, increasing vulnerability to credential-stuffing attacks. Similarly, susceptibility to phishing is common, with the Verizon 2023 Data Breach Investigations Report (DBIR) attributing the human element, including phishing susceptibility and miscellaneous errors, to 74% of breaches analyzed, underscoring how end users often click links or attachments without verification when they promise immediate utility or urgency. Organizations anticipate end users to perform basic troubleshooting, such as restarting devices or checking connections before escalating issues, to reduce support burdens; yet, in practice many users bypass these steps in favor of simplicity, leading to inefficiencies. From a first-principles perspective, end users rationally optimize for immediate utility and minimal effort, often creating tensions with systemic requirements for security and maintenance. Supporting this, usability research demonstrates that intuitive user interfaces can yield 20-30% gains in task completion efficiency, highlighting how misalignments arise when systems demand behaviors counter to users' utility-maximizing instincts rather than designing for them.

Empowerment Through Technology

Mechanisms of User Empowerment

Spreadsheets represent a foundational mechanism for end-user empowerment, originating with VisiCalc in 1979, which allowed non-programmers to perform complex calculations through grid-based interfaces without writing code. This tool abstracted mathematical operations into intuitive cells, enabling business users to model financial scenarios and data relationships directly. Microsoft Excel, released in 1985 for the Macintosh and 1987 for Windows, further democratized this capability by integrating graphical user interfaces and formula automation, and was used by over 1.2 billion people worldwide as of 2023 for data manipulation. Macros and scripting extended spreadsheet functionality, permitting end users to automate repetitive tasks via recorded actions or simple code. Visual Basic for Applications (VBA), introduced in Excel 5.0 in 1993, provided a structured programming environment embedded within familiar tools, allowing users to create custom functions and workflows without full programming expertise. VBA's event-driven model and integration with Office applications enabled procedural logic, such as looping through datasets or generating reports, layered atop the spreadsheet's visual paradigm to handle complexity incrementally. The progression to low-code and no-code platforms builds on these abstractions through visual development environments. Microsoft Power Apps, launched in late 2015, exemplifies this by offering canvas-based app building with connectors to data sources, where users assemble logic via pre-built components rather than imperative code. No-code variants emphasize drag-and-drop interfaces for interface elements and workflows, reducing barriers by encapsulating backend services like databases and APIs into configurable blocks. These platforms rely on metadata-driven architectures that generate underlying code automatically, preserving end-user agency while concealing implementation details. Gartner forecasts that 70% of new organizational applications will leverage low-code or no-code technologies by 2025, up from less than 25% in 2020, driven by such abstraction layers that facilitate application creation without deep technical knowledge.
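The loop-and-report pattern described above can be made concrete with a minimal sketch. This is not VBA itself but the same macro-style automation expressed in Python for brevity; the sales records and the decline-flagging rule are invented sample data, not drawn from the sources cited here.

```python
# Illustrative sketch of macro-style end-user automation: loop over
# tabular records and emit a report, mirroring a recorded VBA macro
# that walks down spreadsheet rows. All figures are made-up samples.
sales = [
    {"region": "North", "q1": 125_000, "q2": 141_500},
    {"region": "South", "q1": 98_200,  "q2": 87_400},
    {"region": "West",  "q1": 153_700, "q2": 166_900},
]

def quarterly_report(rows):
    """Flag quarter-over-quarter declines for each region."""
    lines = []
    for row in rows:
        change = (row["q2"] - row["q1"]) / row["q1"]
        flag = "DECLINE" if change < 0 else "growth"
        lines.append(f'{row["region"]}: {change:+.1%} ({flag})')
    return "\n".join(lines)

print(quarterly_report(sales))
```

The point of the pattern is that the procedural logic (the loop and the conditional) sits on top of data the user already understands visually, which is exactly the incremental layering that VBA added to the spreadsheet paradigm.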

Achievements in Accessibility and Innovation

The introduction of intuitive spreadsheet software like Microsoft Excel in 1985 marked a pivotal achievement in end-user accessibility, allowing non-technical users to conduct sophisticated data analysis and financial modeling without relying on programmers or mainframe systems. Released initially for the Macintosh on September 30, 1985, Excel provided features such as dynamic formulas, charting, and what-if analysis, enabling business professionals to prototype financial models and generate insights independently, which accelerated decision-making in sectors like finance and operations. This democratization extended to broader innovation through citizen development, where end users leverage low-code and no-code platforms to build custom applications, bypassing traditional IT bottlenecks. Gartner forecasts that by 2025, 70% of new enterprise applications will utilize no-code or low-code technologies, with citizen developers—full-time employees outside IT—playing a central role in this shift, contributing tailored solutions in areas like workflow automation and reporting. Such platforms have enabled end users to develop an average of 13 applications each, predominantly web-based, enhancing organizational agility without extensive coding expertise. Empirical evidence underscores productivity gains from these tools; studies show that intuitive interfaces in business intelligence software increase end-user output, with 75% of users reporting higher efficiency and up to 40% reductions in training time due to streamlined interactions. In spreadsheets specifically, end-user innovations like custom macros and model-driven approaches have empirically improved task completion speeds in data-heavy environments, as demonstrated in controlled studies comparing traditional versus model-based paradigms. Skilled end users have further driven innovation via open-source contributions, developing extensions, plugins, and user-specific modifications that address niche requirements and propagate improvements across communities. For instance, end-user networks in projects like GNU/Linux and Perl have originated practical enhancements, such as customized tools for niche workflows, exemplifying how user-led adaptations fuel iterative advancements in software ecosystems. These efforts highlight causal links between accessible tools and tangible outputs, including faster innovation cycles in collaborative environments.

Criticisms and Limitations of Empowerment

While end-user empowerment through accessible tools enables rapid solution-building, it has empirically fostered the proliferation of shadow IT—unauthorized applications and workflows created by non-experts—which often results in inefficient and poorly integrated custom solutions. Surveys indicate that shadow IT constitutes more than half of daily software usage in over half of surveyed companies, bypassing centralized governance and leading to duplicated efforts and resource waste. These ad-hoc systems, such as unauthorized tools or makeshift integrations, introduce operational redundancies that elevate costs without delivering scalable efficiency, as end users prioritize immediate needs over long-term architectural coherence. Cognitive limitations among end users exacerbate these issues, with systematic biases contributing to high error rates in user-generated artifacts like spreadsheets and simple scripts. A 2024 analysis of business spreadsheets revealed that 94% contain critical errors, often stemming from overconfidence in intuitive modeling or lapses in formula validation, which propagate inaccuracies into decision-making. Similarly, end-user programming tasks are prone to anchoring effects, where initial assumptions rigidly shape subsequent logic, resulting in fragile code that fails under edge conditions unforeseen by untrained creators. Such patterns underscore how end-user development, absent rigorous training, amplifies the biased tendencies documented in cognitive research, with error rates in complex tasks hovering around 2-5% per cell or decision point. This over-reliance on user autonomy undermines professional oversight, as IT specialists' expertise in verification and standardization is circumvented, yielding ecosystems of interdependent yet unvetted components that heighten overall system fragility. Empirical observations show that unchecked custom solutions complicate maintenance and auditing, fostering a false narrative of universal user competence despite evidence of persistent defects in non-professional outputs. Consequently, organizations face elevated risks of cascading failures from these brittle constructs, challenging the assumption that broader empowerment inherently enhances reliability without corresponding accountability mechanisms.
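A short back-of-the-envelope calculation shows why near-universal defect rates follow from modest per-cell error rates. Assuming (for illustration only) that each formula cell independently contains an error with probability p, a workbook of n such cells has at least one error with probability 1 - (1 - p)^n; the p values below use the 2-5% range cited above, and the workbook sizes are invented.

```python
# Illustrative probability model: with per-cell error rate p, the chance
# a workbook of n formula cells contains at least one error is
# 1 - (1 - p)**n. Independence between cells is a simplifying assumption.
def prob_at_least_one_error(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

for p in (0.02, 0.05):
    for n in (50, 200, 500):
        print(f"p={p:.0%}, cells={n}: "
              f"{prob_at_least_one_error(p, n):.1%} chance of >=1 error")
```

Even at the low end (p = 2%), a 200-cell workbook carries roughly a 98% chance of containing at least one error, which is consistent with the 94% figure reported for real business spreadsheets.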

Support and Documentation Practices

Essential Components of End-User Support

End-user support relies on a suite of foundational elements to enable effective system utilization, particularly through documentation and tools that address discrepancies in user proficiency without presuming proactive engagement. Core components include user manuals, which offer detailed operational instructions and protocols, and frequently asked questions (FAQs), compiling common queries with concise resolutions to facilitate rapid reference. These resources form the bedrock of self-service support, allowing users to navigate interfaces independently. Interactive assistance mechanisms, such as tooltips and inline help, provide immediate, context-aware guidance embedded within software environments. For example, applications like Google's services utilize callout dialogs to elucidate features and settings, reducing confusion by delivering explanations at the point of need. Help desks complement these by offering human-mediated support for escalated or nuanced issues, typically via ticketing systems or direct channels, ensuring comprehensive coverage across varying complexity levels. Self-service prioritization underpins efficiency in end-user support, as knowledge bases and searchable repositories empower resolution of routine problems without agent intervention. Implementing robust self-service strategies can deflect 30-50% of tickets, substantially curtailing staffing demands and operational costs. This structure acknowledges the spectrum of end-user expertise, supplying explicit directives to mitigate errors arising from incomplete understanding, thereby optimizing support efficiency and minimizing escalations.
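The 30-50% deflection figure translates directly into staffing economics. The sketch below models the savings under assumed inputs—the monthly ticket volume and per-ticket handling cost are hypothetical placeholders, not figures from the sources cited here; only the deflection range comes from the text above.

```python
# Rough model of self-service ticket deflection savings. The volume and
# per-ticket cost are assumed example values; the 30-50% deflection
# range is the figure cited in the surrounding text.
monthly_tickets = 2_000    # assumed monthly help desk volume
cost_per_ticket = 22.0     # assumed fully loaded handling cost, USD

for deflection in (0.30, 0.50):
    deflected = monthly_tickets * deflection
    savings = deflected * cost_per_ticket
    print(f"{deflection:.0%} deflection: {deflected:,.0f} tickets/month "
          f"avoided, ~${savings:,.0f} saved")
```

Even under these conservative assumptions, the range spans hundreds to thousands of deflected tickets per month, which is why knowledge bases are typically the first investment in a support stack.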

Evolution and Best Practices in Documentation

End-user documentation began with printed manuals in the mainframe era, providing physical guides for operating early computer systems like mainframes and minicomputers, where users relied on detailed paper instructions for operation and troubleshooting. These formats offered comprehensive but static content, limited by printing costs, distribution challenges, and the absence of indexing tools, often resulting in user frustration during complex tasks. By the 1990s, the rise of personal computing and the internet shifted documentation toward digital formats, including HTML-based help systems and searchable PDF files integrated into software installations, enabling easier updates and keyword searches. The early 2000s saw the influence of wiki technologies, following the 2001 launch of Wikipedia, which inspired open-editable platforms for documentation, allowing community contributions and real-time revisions in projects like open-source repositories. Post-2020 advancements incorporated AI-driven tools, such as generative chatbots and automated query responders, transforming static guides into interactive, context-aware assistants that provide tailored explanations and reduce navigation time for end users. Best practices in contemporary end-user documentation prioritize clarity and conciseness, employing structured formats with step-by-step instructions, screenshots, and video embeds to minimize ambiguity and user error. Version control systems, adapted from code management tools like Git, ensure traceability of changes and accessibility of historical versions, facilitating maintenance in agile environments. Empirical evidence from case analyses shows that high-quality, detailed documentation correlates with reduced user reliance on external support, as self-service resources lower helpdesk interactions and associated error resolutions. Overly verbose or jargon-heavy documentation has been criticized for alienating novice users, increasing error rates through misinterpretation, with studies recommending example-driven approaches that focus on common scenarios over exhaustive theoretical coverage. This shift toward verifiable, practical content—supported by usability findings—avoids padding, emphasizing causal links between precise guidance and effective user outcomes, such as fewer misconfigurations in deployed systems.

Security Considerations

Vulnerabilities Stemming from End-User Actions

End-user actions represent a primary source of cybersecurity vulnerabilities, as individuals often prioritize convenience or overlook risks in daily tasks, distinct from flaws in software code or administrative oversights. These behaviors enable attackers to exploit human psychology rather than technical weaknesses alone, with empirical data indicating that stolen or weak credentials—frequently resulting from user choices—served as the initial access vector in 19% of breaches analyzed in 2023. Similarly, social engineering attacks succeed due to users' responses to deceptive prompts, capitalizing on urgency or curiosity without requiring sophisticated code exploits. Susceptibility to phishing exemplifies how end-user haste and lack of vigilance create entry points, as attackers craft messages inducing rapid clicks on malicious links or attachments, bypassing other defenses. Proofpoint identifies social engineering, including phishing, as leveraging emotions like fear and urgency to prompt actions that 95% of surveyed security professionals link to breaches in broader contexts, though precise attribution varies by incident type. Weak password practices compound this, with users selecting easily guessable or reused credentials; statistics show that 60% of individuals reuse passwords across accounts, facilitating credential-stuffing attacks where a single compromise cascades. Failure to apply software patches further stems from user inaction, leaving systems exposed to known exploits that persist due to deferred updates rather than undiscovered developer errors. Causal analysis reveals these risks arise from predictable human tendencies, such as underestimating low-probability threats or favoring immediate task completion over security diligence, independent of systemic incentives. For instance, cognitive shortcuts lead users to ignore warning signs in unverified emails, amplifying vulnerabilities in ways unaddressed by code-level fixes. The economic toll underscores prevalence: the global average cost of a data breach reached $4.88 million in 2024, with user-enabled vectors like phishing and compromised credentials driving a substantial share, per IBM's analysis of over 600 incidents. This contrasts with purely technical failures, as user decisions form the proximal cause in chains where behavioral lapses precede exploitation.
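One practical countermeasure to the password-reuse problem is screening candidate passwords against known breach corpora. The sketch below uses the public Pwned Passwords k-anonymity range endpoint (a real, keyless API from Have I Been Pwned, in which only the first five SHA-1 hex characters leave the machine); it requires network access, and the example password is deliberately weak.

```python
# Minimal sketch: check whether a password appears in known breach
# corpora via the Pwned Passwords k-anonymity range API. Only the
# 5-character SHA-1 prefix is sent; matching is done locally.
import hashlib
import urllib.request

def breach_count(password: str) -> int:
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    req = urllib.request.Request(
        f"https://api.pwnedpasswords.com/range/{prefix}",
        headers={"User-Agent": "password-check-sketch"},
    )
    with urllib.request.urlopen(req) as resp:
        body = resp.read().decode("utf-8")
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    hits = breach_count("password123")  # deliberately weak example
    print(f"seen in breaches {hits:,} times" if hits else "not found")
```

Organizations often wire a check like this into account-creation flows so that recycled, already-breached credentials are rejected before they can feed credential-stuffing attacks.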

Empirical Evidence of Risks and Real-World Incidents

The human element, including end-user actions such as falling for phishing or committing misconfigurations, has been a factor in a significant majority of data breaches. According to Verizon's 2024 Data Breach Investigations Report, which analyzed over 30,000 incidents and 10,000 confirmed breaches, 68% involved non-malicious human actions, such as errors or social engineering susceptibility, remaining consistent with prior years. This pattern underscores the frequency of user-related contributions across industries, where simple oversights enable initial access or lateral movement by attackers. A prominent example occurred in the 2016 Democratic National Committee (DNC) breach, initiated via spear-phishing. In March 2016, Russian military intelligence operatives sent spoofed emails mimicking Google security alerts to Democratic personnel, including Clinton campaign chairman John Podesta, who clicked a malicious link on March 19, compromising his account and facilitating broader network infiltration. This led to the exfiltration of thousands of emails, leaked via WikiLeaks in July 2016, highlighting how individual user responses to deceptive prompts can cascade into organizational compromise. In the realm of supply chain attacks with user involvement, the 2020 SolarWinds Orion breach affected up to 18,000 organizations after end users routinely updated software with tampered versions. Attackers inserted the SUNBURST backdoor into legitimate Orion updates distributed during 2020, exploiting trust in vendor releases; once installed by administrators, it enabled persistence and data theft from high-profile targets like U.S. government agencies. Propagation relied on users' standard deployment practices without additional verification, amplifying the initial compromise. Shadow IT practices, where end users deploy unauthorized tools, have similarly contributed to breaches. IBM's analysis indicates that 35% of 2023 breaches involved unmanaged or "shadow" data sources, often stemming from unsanctioned cloud apps or storage, increasing detection and response times by an average of 100 days. Statistics show nearly half of cyberattacks trace to shadow IT, with associated remediation costs averaging $4.2 million per incident, driven by lack of oversight in user-initiated adoptions. These cases, spanning corporate environments, illustrate recurring patterns of unauthorized user actions bypassing controls, though individual consumer incidents like personal account takeovers follow similar mechanics on a smaller scale.

Strategies for Enhancing User Security Awareness

Effective security awareness programs emphasize practical training methods that foster individual responsibility, such as simulated phishing exercises, which expose users to realistic scenarios to build recognition skills. Data from KnowBe4's analysis of over 1,000 organizations indicates that such simulations, combined with remedial training, reduced the phish-prone percentage—a metric of users likely to click simulated phishing links—from a baseline of 33.1% to 4.1% after 12 months, representing an 86% decrease, with initial drops of 40% within three months. This approach prioritizes repeated, low-stakes exposure over one-off lectures, enabling users to internalize threats through direct experience rather than passive instruction. Mandatory multi-factor authentication (MFA) serves as a foundational control, compelling users to adopt verification habits that mitigate credential-based risks without external dependencies. Implementation of MFA has been shown to block over 99% of automated account takeover attempts, as it requires possession of a second factor beyond passwords, training users to verify prompts critically. While bypass techniques exist, such as phishing for one-time codes, consistent user adherence—reinforced through awareness campaigns—amplifies its causal impact on reducing unauthorized access, with studies confirming it as a high-yield safeguard for end users managing personal or work accounts. Behavioral analytics integrates monitoring of user patterns to enhance awareness proactively, flagging anomalies like unusual login times or data access that signal potential compromises or careless habits. User and entity behavior analytics (UEBA) tools analyze deviations from established baselines, alerting individuals to self-correct or escalate issues, thereby cultivating vigilance without paternalistic oversight. This method supports accountability by providing actionable feedback, such as notifications of risky behaviors, which empirical deployments show improve threat detection by identifying insider errors early. Complementing these, endpoint detection and response (EDR) technologies empower users by automating alerts on device-level threats, encouraging habitual checks like software updates and safe browsing. Comprehensive awareness initiatives incorporating these elements deliver measurable returns; for instance, effective programs correlate with up to 70% reductions in security risks and make organizations 8.3 times less likely to suffer a breach, per vendor-analyzed datasets, underscoring the value of user-centric, evidence-driven training over regulatory crutches. ROI calculations from such programs often exceed 100%, as avoided incidents offset costs, with one model estimating $138,000 annual savings per organization from diminished breach probabilities.
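The phish-prone percentage itself is a simple ratio, and the sketch below shows how a program might track it across campaigns. The per-campaign click counts are invented to approximate the 33.1% → 4.1% arc reported above, not actual KnowBe4 data.

```python
# Illustrative phish-prone percentage (PPP) tracking: users who clicked
# a simulated lure divided by users tested, per campaign. Counts are
# invented to mirror the reported baseline-to-12-month trajectory.
campaigns = [
    ("baseline",  331, 1000),   # (label, clickers, users tested)
    ("3 months",  198, 1000),
    ("12 months",  41, 1000),
]

baseline_ppp = campaigns[0][1] / campaigns[0][2]
for label, clicked, tested in campaigns:
    ppp = clicked / tested
    drop = (baseline_ppp - ppp) / baseline_ppp
    print(f"{label:>9}: PPP {ppp:.1%} (reduction vs. baseline: {drop:.0%})")
```

Tracking the metric per campaign rather than per user is what makes the "repeated, low-stakes exposure" strategy measurable: the trend line, not any single test, is the evidence of behavior change.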

Liability Frameworks for End-User Conduct

End-user liability for misconduct in computing environments primarily arises under tort law principles of negligence, where users fail to exercise reasonable care in handling systems, leading to foreseeable harm such as data breaches or unauthorized disclosures. For instance, an end user who negligently shares credentials or ignores security warnings may be held accountable in civil actions for resulting damages, as tort doctrine requires proving breach of a duty of care, causation, and injury. Contractual agreements, including end-user license agreements (EULAs) and acceptable use policies, further reinforce user responsibility by stipulating compliance with usage policies, often limiting recourse against providers while imposing penalties for user violations. In the United States, the Computer Fraud and Abuse Act (CFAA), enacted in 1986, addresses certain end-user conduct involving unauthorized computer access or exceeding authorized access, potentially leading to criminal or civil liability for actions like intentional data exfiltration by employees. However, judicial interpretations have narrowed CFAA applicability; for example, the Third Circuit in 2025 ruled that mere violations of employer computer-use policies, without evidence of technical circumvention like hacking, do not constitute CFAA offenses, rendering criminal prosecutions rare absent clear unauthorized entry. Empirical data supports this infrequency: while insider threats account for about 34% of breaches per Verizon's 2024 report, CFAA convictions against non-hacking users remain exceptional, with most resolutions handled civilly through negligence claims or employment disputes rather than federal prosecution. This user-focused accountability contrasts sharply with developer liability under product liability doctrines, where providers face claims only for defective design or implementation flaws, not user errors in operation. Courts distinguish misuse—attributable to the end user—from inherent product defects; for example, a 2018 analysis noted that robust licensing agreements shield developers from liability for user-induced harms, shifting the burden to the user's negligent conduct. Such frameworks incentivize user caution by personalizing accountability, though critics argue they may deter adoption of complex tools by imposing undue individual burdens without proportionate enforcement. Post-2023 developments in AI applications have highlighted rising user accountability for misuse, with cases emphasizing personal responsibility over tool-provider fault. In Mata v. Avianca, Inc. (S.D.N.Y. 2023), attorneys faced sanctions for submitting fictitious case citations generated by ChatGPT without verification, underscoring negligence in relying on unvetted AI outputs. Similar incidents, including UK judicial warnings in 2025 against AI-generated fabrications in filings, indicate a trend toward professional discipline and potential liability for users who fail to mitigate foreseeable AI errors, though criminal cases remain scarce. Internationally, liability frameworks exhibit variances, with the European Union imposing stricter standards on user conduct through national codes and directives like the NIS2 Directive (in force since January 2023), which extend to individuals in critical sectors for lapses contributing to systemic risks. Unlike the U.S. emphasis on contractual and CFAA boundaries, EU approaches integrate negligence with broader regulatory duties, potentially heightening user exposure in cross-border scenarios, as seen in fines for willful mishandling under GDPR-aligned laws. These regimes collectively underscore end-user agency, countering tendencies to externalize blame to technology providers while balancing innovation with accountability.

Regulatory Impacts on User Autonomy and Privacy

The European Union's General Data Protection Regulation (GDPR), effective May 25, 2018, mandates explicit user consent for data processing, aiming to bolster individual control over personal information. However, empirical analyses reveal that this framework often overwhelms users, fostering "consent fatigue" where repeated prompts lead to habitual acceptance without genuine comprehension or deliberation. Studies indicate that such fatigue undermines the regulation's goal of informed consent, as users increasingly default to approving terms to access services, paradoxically weakening effective privacy protection. Similarly, the California Consumer Privacy Act (CCPA), enacted in 2018 and operative from January 1, 2020, grants users rights to opt out of data sales, intending to enhance privacy agency. Yet, compliance burdens have correlated with diminished service availability, as firms face heightened operational costs and legal risks, reducing innovation in data-driven applications. Evidence from post-GDPR and analogous CCPA contexts shows a decline in digital service supply within regulated jurisdictions, with funding for tech startups dropping by up to 20% in affected regions due to compliance hurdles. This contraction limits user options, as smaller providers exit markets unable to absorb regulatory overhead, thereby curtailing practical autonomy despite formal empowerment mechanisms. Debates over privacy self-management highlight inherent flaws in consent-centric models, where users' cognitive limits and information overload preclude robust decision-making, as evidenced by surveys documenting widespread app uninstallations—up to 72%—attributed to intrusive consent interfaces mandated by such laws. Proposals like the U.S. EARN IT Act, reintroduced in 2023, exemplify tensions with end-to-end encryption, conditioning liability protections on scanning for illicit content, which pressures providers to weaken cryptographic safeguards under the pretext of child safety. Such measures risk normalizing proactive surveillance by intermediaries, eroding user autonomy in favor of state-mandated interventions that prioritize collective risk mitigation over individual tools. Evidence suggests these regulatory impulses favor top-down controls, diminishing the efficacy of user-deployed protections like strong encryption, which empirical research upholds as superior for preserving privacy against both private and governmental overreach.

Debates Over Government and Corporate Overreach

Debates over government-mandated access to encrypted communications have intensified in the 2020s, with proponents arguing that "lawful access" mechanisms are essential for law enforcement to combat child exploitation and terrorism. For instance, the U.S. EARN IT Act, reintroduced in 2022, seeks to hold tech providers liable for failing to detect child sexual abuse material (CSAM), potentially pressuring companies to weaken encryption or scan user data preemptively. Advocates, including some lawmakers, claim such measures enhance public safety without creating universal backdoors, as access would require warrants. However, critics from organizations like the Electronic Frontier Foundation (EFF) contend that no technically feasible "responsible" backdoor exists, as any weakening of encryption exposes all end users to risks from adversaries, including foreign states, undermining the very security these policies aim to protect. Empirical evidence supports skepticism toward these interventions; post-Edward Snowden revelations in 2013, public awareness of U.S. surveillance programs reached 87% by 2015, prompting 22% of Americans to increase privacy protections like using encryption tools, yet trust in government handling of personal data remained low at 35%. Security practitioners echo this, noting that historical proposals like the 1990s Clipper chip failed due to inevitable key compromises, and modern equivalents would similarly erode end-user confidence in digital tools essential for everyday computing. In the UK, 2025 proposals for scanning encrypted messages have drawn warnings that they jeopardize national cybersecurity standards, as weakened protocols invite exploitation beyond intended use. Corporate practices exacerbate these concerns through extensive data collection justified under opaque privacy policies, often gathering user behavioral data far exceeding service needs, which fuels debates on surveillance capitalism. A 2024 report highlighted how major platforms like Meta and Google aggregate vast personal datasets for advertising, with end users facing limited practical control despite opt-out options, leading to incidents like the Facebook-Cambridge Analytica scandal affecting 87 million users. Proponents of tighter corporate regulation cite safety benefits, such as improved content moderation, but detractors argue it stifles innovation and user autonomy, as firms respond by exiting markets or layering compliance costs that reduce service quality for non-expert end users. Critiques from liberty-oriented perspectives emphasize that both governmental and corporate overreach cultivate user dependency on centralized systems, discouraging self-reliant security practices like personal key management. For example, mandatory lawful-access proposals implicitly prioritize state capabilities over individual rights, while corporate data hoarding—estimated to include 50-90% "dark" unused data—creates vulnerabilities that governments later exploit under emergency pretexts, as seen in expansions of surveillance laws. This dynamic, unmoored from first-hand evidence of net safety gains, risks normalizing controls that diminish end-user agency in digital environments, with empirical trust erosion persisting since 2013 without corresponding reductions in crime via such measures.

Integration of AI and Low-Code Platforms

The integration of artificial intelligence (AI) with low-code platforms has significantly expanded end-user capabilities in the 2020s, enabling non-technical users to create applications through natural language prompts and visual interfaces rather than traditional programming. Following the release of ChatGPT in November 2022, AI assistants have facilitated code generation for tasks ranging from simple scripts to complex algorithms, allowing end users to prototype software rapidly without extensive coding expertise. Low-code platforms, such as Microsoft Power Apps, complement this by providing drag-and-drop tools that vendors claim accelerate development by up to 10 times compared to conventional methods, democratizing app creation for business users. Adoption of these technologies has surged, with Gartner forecasting that 70% of new enterprise applications will utilize low-code or no-code methods by 2025, up from less than 25% in 2020, driven by demands for faster delivery. However, this empowerment introduces shadow AI risks, where end users deploy unauthorized tools outside IT oversight, potentially leading to data leakage, compliance violations under regulations like GDPR, and exposure of sensitive information to external models. Surveys indicate that up to 80% of organizations exhibit unapproved AI activity, amplifying vulnerabilities such as prompt injection attacks or biased outputs in user-built applications. Empirical studies on AI-assisted development reveal productivity gains tempered by limitations. Microsoft-backed trials reported a 21% boost in complex knowledge work via AI assistance, including code-related tasks, while some self-reports noted 6.5% time savings. Conversely, a 2025 randomized controlled trial found experienced open-source developers 19% slower when using early-2025 tools, due to increased time spent reviewing and correcting AI outputs. Error rates remain a concern, with at least 48% of AI-generated code containing security vulnerabilities, and rates exceeding 70% for some languages in user applications. These findings underscore that while AI-low-code integration enhances end-user agility, it necessitates rigorous validation to mitigate defects and risks in production environments.
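The "rigorous validation" called for above can be as simple as refusing to accept generated code until it parses and passes tests. The sketch below is one minimal way to gate a candidate snippet; the `candidate_source` string stands in for whatever an AI assistant produced, and the `median` function and its tests are invented for illustration.

```python
# Minimal sketch of gating AI-generated code behind automated checks:
# parse it, execute it in an isolated namespace, and run a small test
# suite before accepting it. The candidate below is a stand-in for
# real assistant output.
import ast

candidate_source = """
def median(values):
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2
"""

def validate(source: str) -> bool:
    try:
        ast.parse(source)          # reject syntactically broken output
    except SyntaxError:
        return False
    namespace: dict = {}
    exec(source, namespace)        # NOTE: sandbox this in real use
    fn = namespace.get("median")
    if fn is None:
        return False
    tests = [([1, 3, 2], 2), ([4, 1, 3, 2], 2.5), ([7], 7)]
    return all(fn(inp) == expected for inp, expected in tests)

print("accepted" if validate(candidate_source) else "rejected")
```

Given the cited vulnerability rates in generated code, a test gate like this catches only functional defects; security review (linters, dependency scanning, sandboxed execution) remains a separate, necessary layer.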

Shifts Toward Cloud and Virtual End-User Computing

The transition to cloud-based and virtual end-user computing has accelerated since the early 2020s, driven by the need for flexible infrastructure. Azure Virtual Desktop (AVD), rebranded from Windows Virtual Desktop in June 2021 following its general availability in October 2019, enables organizations to deliver virtualized Windows desktops and applications hosted in cloud data centers. Complementing this, Microsoft launched Windows 365 in August 2021 as a cloud PC service, providing per-user virtual machines accessible from any device without local hardware management. These solutions represent a shift from on-premises virtual desktop infrastructure (VDI) to desktop-as-a-service (DaaS) models, with analysts forecasting that 60% of enterprises will rely on remote access services and virtualized workspaces by the end of 2025 to support operational agility. DaaS market spending is projected to grow from $4.3 billion in 2025 to $6.0 billion by 2029, reflecting a compound annual growth rate of approximately 9%. Centralized management in these cloud VDI environments reduces dependency on end-user hardware by hosting desktops on scalable cloud resources, allowing administrators to apply patches, updates, and policies across fleets simultaneously. This approach enhances digital employee experience (DEX) metrics, such as user productivity and satisfaction, by ensuring consistent access to resources regardless of endpoint devices. Organizations report cost savings of 30-40% in desktop management and hardware expenditures through resource pooling and elimination of physical PC refreshes, though initial implementation requires optimizing virtual machine sizing to avoid overprovisioning. Despite these advantages, latency remains a challenge in remote scenarios, where high network delays—often exceeding 100-150 ms—can degrade performance for graphics-intensive tasks or real-time interactions, necessitating edge caching or protocol optimizations like those in AVD. Looking to future trends, these shifts enable greater scalability for hybrid workforces, with auto-scaling features in platforms like Windows 365 supporting dynamic provisioning to handle fluctuating demand without upfront capital outlays, potentially reducing costs by up to 30% for expanding enterprises. This positions cloud end-user computing as a foundational layer for resilient, device-agnostic operations amid rising remote adoption.
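Whether a given user falls on the wrong side of the 100-150 ms threshold is straightforward to probe. The sketch below times TCP handshakes to a remote-desktop gateway as a rough round-trip proxy; `gateway.example.com` is a placeholder hostname, not a real AVD or Windows 365 endpoint, and a handshake is only an approximation of the display protocol's actual latency.

```python
# Rough latency probe: time several TCP handshakes to a (placeholder)
# remote-desktop gateway and compare the average against the ~150 ms
# interactivity ceiling discussed above. Handshake time approximates,
# but does not equal, the display protocol's round-trip latency.
import socket
import time

def average_rtt_ms(host: str, port: int = 443, samples: int = 5) -> float:
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=3):
            pass                       # close immediately after connect
        timings.append((time.perf_counter() - start) * 1000)
    return sum(timings) / len(timings)

if __name__ == "__main__":
    rtt = average_rtt_ms("gateway.example.com")   # hypothetical host
    verdict = "acceptable" if rtt < 150 else "consider edge caching"
    print(f"average handshake RTT: {rtt:.0f} ms -> {verdict}")
```

In practice, administrators run probes like this from representative user locations to decide where edge points of presence or protocol optimizations are worth deploying.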

Emerging Controversies in Shadow IT and AI Liability

Shadow IT, encompassing end-user adoption of unapproved software and services, continues to provoke debates over organizational control versus individual productivity gains, with recent analyses indicating it constitutes 30-40% of IT spending in large enterprises. This prevalence heightens breach risks, as unauthorized tools often lack security vetting, leading to vulnerabilities like data exposure; for instance, 11% of global cyber incidents in 2024 were linked to such usage. Proponents argue that curbing shadow IT stifles innovation, citing surveys where 80% of employees adopt these tools for efficiency, yet critics highlight evidence of escalating exposures, particularly in remote work environments post-2022. The integration of AI has intensified these tensions through "shadow AI," where end users deploy generative models without oversight, exemplified by ChatGPT emerging as the leading offender in shadow IT rankings by mid-2024. Corporate data fed into such tools surged 485% from March 2023 to March 2024, amplifying risks of sensitive information leakage and compliance violations under frameworks like GDPR or HIPAA. This user-driven proliferation underscores a core controversy: while AI enhances task efficiency, unmonitored deployments introduce prompt injection attacks and data exfiltration, with 2024 marking a record year for AI-facilitated exfiltration in sectors like finance and healthcare. AI liability debates center on apportioning responsibility for harms arising from opaque, "black-box" systems prompted by end users, as explored in Yale analyses questioning whether fault lies with the tool's owner, deployer, or original developer. In user-initiated errors—such as biased outputs or erroneous decisions—legal scholars contend that end-user prompting complicates traditional negligence standards, potentially shifting burdens to organizations despite individual culpability; this view contrasts with provider defenses emphasizing user input as the causal factor. Empirical cases, including 2024 voice-based fraud incidents rising 3,000% from prior years, illustrate how shadow AI evades accountability, fueling calls for explicit liability regimes over blanket prohibitions. Looking ahead, controversies pivot toward decentralized AI architectures, which promise greater end-user autonomy by distributing computation across networks resistant to centralized oversight, potentially mitigating risks through peer-verified tools. Yet, this trajectory clashes with regulatory momentum, such as the EU AI Act's risk-based classifications effective from 2024, which aim to impose governance on high-impact systems but struggle against borderless decentralized models. Advocates for decentralization argue it fosters causal transparency via blockchain-augmented AI, countering black-box opacity, though skeptics warn of amplified liability diffusion in such ecosystems, where user prompts could trigger untraceable harms amid fragmented enforcement.

References

  1. [1]
    End User vs. Customer: Definitions and Key Differences - Investopedia
    The term "end user" refers to the consumer of a good or service, often with unique insights or know-how gained from using the product.
  2. [2]
    What is an End User? Definition, Types, Examples and Management ...
    End users are individuals who interact directly with the final software application, hardware device, or digital service, utilizing its features and ...
  3. [3]
    What is end-user computing (EUC)? | Definition from TechTarget
    Jun 2, 2023 · EUC is a term that refers to the technologies that IT professionals use to deploy, manage and secure the devices, applications and data that workers require to ...
  4. [4]
    What Is an End User? Definition, Examples and Tips | Indeed.com
    Jul 24, 2025 · An end user is the person a product development team has in mind when they design a product. They differ from customers because they can't resell the product.
  5. [5]
    End-Users - an overview | ScienceDirect Topics
    End users are defined as the majority of personal computer users who utilize available software for various tasks, distinguishing them from computer ...
  6. [6]
    What is the difference between a user and an end-user?
    Jun 22, 2022 · End users are the users of an end product or service, while users are all the users and stakeholders who interact with the different phases of such a product ...
  7. [7]
    What is the difference between end-users, normal users, power ...
    Feb 26, 2023 · End-users and normal users are one and the same. These are the average users of computing systems. They usually use the desktop or laptop they are given.
  8. [8]
    Who are the End Users? Definitions, Functions, Roles, and Examples
    Jul 3, 2024 · End users are the ultimate consumers or operators of a product, system, or service. They interact directly with the final product to achieve their specific ...
  9. [9]
    The IBM System/360
    The IBM System/360, introduced in 1964, ushered in a new era of compatibility in which computers were no longer thought of as collections of individual ...
  10. [10]
    VisiCalc - Apple II Software - The Centre for Computing History
    VisiCalc was the first spreadsheet program for personal computers, released for Apple II in 1979. It was considered Apple II's "killer app" and required 32K of ...
  11. [11]
    The IBM PC
    On August 12, 1981, Estridge unveiled the IBM PC at New York's Waldorf Hotel. Priced at USD 1,565, it had 16 kilobytes of RAM and no disk drive, and it came ...
  12. [12]
    Apple Macintosh Original (128k) Specs - EveryMac.com
    January 24, 1984*, Discontinued Date: October 1, 1985. Details: The "Introduction Date" refers to the date a model was introduced via press release. The ...
  13. [13]
    Today in Media History: Lotus 1-2-3 was the killer app of 1983
    Jan 26, 2015 · Lotus 1-2-3's original release date was 32 years ago today, on January 26, 1983. Another spreadsheet you may have heard of, Microsoft's Excel, ...
  14. [14]
    Microsoft Excel | Description & History | Britannica
    Sep 26, 2025 · Microsoft developed a competing spreadsheet, and the first version of Excel was released in 1985 for Apple Inc.'s Macintosh computer. Featuring ...
  15. [15]
    Attitudes Toward Computers Across Adulthood From 1994 to 2013
    In 1989, about 15% of all households in the U.S. owned a personal computer. In contrast, currently 75% of U.S. households have a desktop or laptop; 77% of ...
  16. [16]
    On this day in history, June 29, 2007, the first iPhone goes on sale
    Jun 29, 2024 · On this day in history, June 29, 2007, the first Apple iPhone was released, changing the smartphone space forever. The phone sold for $600 ...
  17. [17]
    Which Career to Choose: Programmer Vs System Administrator
    Jul 18, 2023 · In this tutorial, we explore differences between Programmers and System administrators, more specifically in terms of the skillsets.
  18. [18]
    SysAdmin vs Developer – SWL - Software League
    Dec 3, 2024 · SysAdmins focus on the infrastructure and ensuring systems run smoothly, while Developers create the software and applications that utilize that infrastructure.
  19. [19]
    IT Administrator or Software Developer - Atera's Blog
    Aug 13, 2024 · Software developers build and manage software, while IT admins support machines, build networks, and troubleshoot. Developers focus on creative ...
  20. [20]
    The Troubling Trend in Trouble Tickets | NETSCOUT
    May 30, 2024 · In 2021, 42 percent of respondents indicated that trouble tickets represented more than 50 percent of the issues reported by employees to help desks.Missing: Gartner | Show results with:Gartner<|separator|>
  21. [21]
    The History of Spreadsheets - Subset Blog
    Feb 8, 2023 · The first electronic spreadsheet, VisiCalc, was developed in the late 1970s. Lotus 1-2-3, released in 1983, became the first massively popular spreadsheet ...
  22. [22]
    The history of spreadsheets - Sheetgo
    Spreadsheets started as paper ledgers, then evolved to digital programs like VisCalc, and later GUI programs like Excel, with Microsoft Excel becoming dominant.
  23. [23]
    An Ode to Excel: 34 Years of Magic
    Aug 25, 2019 · Excel is a successful, long-lasting tool that made complex analysis accessible, changing how people run their lives and businesses. It is a ...
  24. [24]
    A Brief History of VBA - VB & VBA in a Nutshell: The Language [Book]
    1993—VBA launched with Microsoft Excel. VBA first saw the light of day as a replacement for the macro language in Microsoft Excel. Its power and ...
  25. [25]
    Where Did It All Begin? A Look at the Turbulent History of VBA and ...
    Sep 10, 2025 · Microsoft first introduced VBA with Excel version 5 in 1993. This was the beginning of a quiet revolution—the entry of a powerful programming ...
  26. [26]
    When Was Power Apps Launched & Is It Going Away? | CloudJoy
    How Long Has PowerApps Been Around? PowerApps was launched by Microsoft in late 2015. So, as of 2023, it has been around for nearly 8 years.
  27. [27]
    What Is No Code? | IBM
    Using a no-code platform, they can assemble the app in hours instead of weeks by dragging and dropping modules such as question fields, buttons and data ...
  28. [28]
    A Brief History of Microsoft Excel | Noble Desktop
    Jul 15, 2025 · Microsoft Excel was initially introduced in 1985. Excel 1.0 was one of the first spreadsheet applications that used a graphical interface ...
  29. [29]
    Future of Citizen Development: Unlock your Workplace in 2025 - Quixy
    Gartner estimates by 2025, 70% of new applications developed by enterprises will use no-code or low-code technologies (up from less than 25% in 2020). Gartner ...
  30. [30]
    30+ Low-Code/ No-Code Statistics - Research AIMultiple
    Aug 14, 2025 · 70% of new applications developed by organizations will use low-code or no-code technologies by 2025, up from less than 25% in 2020. · 41% of ...
  31. [31]
    Intuitive Interfaces Enhance Business Intelligence Tools | MoldStud
    Apr 24, 2025 · Research shows that 75% of end-users are more productive when software is easy to use, leading to a reduction in training time by up to 40%.
  32. [32]
    End Users Productivity in Model-based Spreadsheets - ResearchGate
    To improve end-users productivity, recent research proposes the use of a model-driven engineering approach to spreadsheets. In this paper we conduct the first ...
  33. [33]
    [PDF] Open Source Software Projects as User Innovation Networks
    Well-known examples of free or open source software are the GNU/Linux computer operating system, Perl programming language, and Internet e- mail engine SendMail ...
  34. [34]
    The Importance of Open Source Contributions by End Users
    Dec 19, 2023 · Open source software has become the foundation of the digital age. It drives innovation, fosters collaboration, and empowers users worldwide.
  35. [35]
    [PDF] The State of Shadow IT in 2023 - Cledara
    In fact, shadow IT accounts for more than half of daily software usage for over half the companies surveyed. Whilst this might sound shocking, it corroborates.
  36. [36]
    What are the pros and cons of shadow IT? - TechTarget
    Mar 5, 2024 · The cons of shadow IT · Rings up extra costs · Increases cybersecurity risks · Puts proprietary and protected data at risk · Introduces performance ...
  37. [37]
    Study finds 94% of business spreadsheets have critical errors
    Aug 13, 2024 · A recent study has found that 94% of spreadsheets used in business decision-making contain errors, posing serious risks for financial losses and operational ...
  38. [38]
    Cognitive biases in user experience and spreadsheet programming
    Feb 1, 2025 · Cognitive biases are systematic patterns of deviation from norm or rationality in judgment, which can significantly impact user experience (UX) ...
  39. [39]
    [PDF] Spreadsheet Errors: What We Know. What We Think We Can Do.
    The Introduction noted that human error rates in complex cognitive tasks tend to be about 2% to 5%, so spreadsheet errors must be fairly frequent or we will ...
  40. [40]
    (PDF) What We Know About Spreadsheet Errors - ResearchGate
    Aug 6, 2025 · We have learned a good deal about the errors that people make when they develop spreadsheets. In general, errors seem to occur in a few percent of all cells.
  41. [41]
    Cognitive Biases in Software Development
    Apr 1, 2022 · Cognitive biases are hardwired behaviors that influence developer actions and can set them on an incorrect course of action, necessitating backtracking.
  42. [42]
    IT Service Desk: 5 Key Components and Their Benefits - DevRev
    May 30, 2025 · Knowledge base and documentation - A comprehensive knowledge base and well-documented procedures are essential resources for the IT Service ...
  43. [43]
    3 Examples of Inline Help – Baymard Institute
    Sep 24, 2010 · 3 Examples of Inline Help ; 1) Google: Callout dialog · Introducing features · Setting a clear context ; 2) Amazon: What's this? · Unobtrusive help ...
  44. [44]
    The key components of a successful IT support desk - Endsight
    Jun 6, 2024 · Key components include customer focus, streamlined communication, knowledge base, proactive issue resolution, team education, and a responsive ...
  45. [45]
    Self-Service Reduces Support Costs While Improving Customer ...
    Dec 11, 2022 · Service directors implementing comprehensive self-service strategies typically see 30-50% reduction in support ticket volume while achieving ...
  46. [46]
    What is end-user support? - WalkMe
    End-user support is help given to people who use a product or service, helping them solve problems and use it correctly. It is broader than service desks.
  47. [47]
    From Traditional Manuals to Interactive Digital Guides - Amarel-US
    Aug 29, 2024 · This article delves into the history of user manuals, tracing their evolution from traditional paper-based formats to the dynamic, interactive digital guides ...
  48. [48]
    The Evolution of User Manuals: From Print to Digital - Manuals+
    As technology continued to advance, user manuals evolved further into interactive digital formats. Companies started developing dedicated software applications ...
  49. [49]
    Technical Documentation in Software Development: Types and T
    Jul 6, 2024 · In this article, we delve into the crucial role technical documentation plays in software development. We'll guide you through the various ...
  50. [50]
    The Evolution of Technical Documentation: AI-Driven Solution
    Dec 19, 2024 · Explore the journey of technical documentation from manual translation to AI-driven solutions. Learn how AI enhances speed, accuracy, ...
  51. [51]
    Best Practices for a User Manual Format: Tips for Clarity and Usability
    Oct 16, 2025 · Explore user manual format best practices to boost clarity and usability. Let's make your manuals user-friendly and effective together!
  52. [52]
    [PDF] The Importance of Software Documentation in the Development and ...
    Apr 7, 2023 · Current research shows that detailed and easy-to-follow documentation decreases end-user reliance on helpdesk service assistance since the users ...
  53. [53]
    The Usability of Technical Documentation: An Overview - Archbee
    This article dives deep into this concept, elucidating what exactly usability entails, why it's significant, and its best practices.
  54. [54]
    Evaluating usage and quality of technical software documentation
    One of the results is that, in the context of our case-study, usage of documentation for an implementation purpose is higher than the usage for maintenance ...
  55. [55]
    What Is Social Engineering? - Definition, Types & More | Proofpoint US
    In a social engineering threat, an attacker uses human emotion (usually fear and urgency) to trick the target into performing an action.
  56. [56]
    70+ Password Statistics for 2025 - Spacelift
    Oct 16, 2025 · An automated attack occurs every 39 seconds, 79% mix words/numbers, 60% reuse passwords, 49% of breaches involve weak passwords, and 24 billion ...<|separator|>
  57. [57]
    IBM Report: Escalating Data Breach Disruption Pushes Costs to ...
    Jul 30, 2024 · IBM released its annual Cost of a Data Breach Report revealing the global average cost of a data breach reached $4.88 million in 2024, ...
  58. [58]
    The Human Factor in Cybersecurity - SecurityScorecard
    Feb 16, 2024 · Humans are susceptible to cognitive biases, such as the tendency to prioritize convenience over security or to underestimate risks when they ...
  59. [59]
    [PDF] 2024 Data Breach Investigations Report | Verizon
    May 5, 2024 · For this year's dataset, the human element was a component of 68% of breaches, roughly the same as the previous period described in the 2023 ...
  60. [60]
    How Russians broke into Democrats' email - CNBC
    Jul 16, 2018 · In the Democratic National Committee hack in 2016, those emails were just spoofed to look like security updates from Google, according to ...
  61. [61]
    Grand Jury Indicts 12 Russian Intelligence Officers for Hacking ...
    Jul 13, 2018 · After public accusations that the Russian government was behind the hacking of DNC and DCCC computers, defendants created the fictitious persona ...
  62. [62]
    An Investigative Update of the Cyberattack - SolarWinds Blog
    May 7, 2021 · A deep dive into the SUNBURST attack of 2020. Find out the full insights from the SUNBURST investigation and ongoing safety measures.Missing: propagation | Show results with:propagation
  63. [63]
    [PDF] A Review of the SolarWinds Attack on Orion Platform using ... - arXiv
    In 2020 SolarWinds attack indicates an initial breach disclosure on Orion ... hackers were capable to propagate the attack by using a VMware exploit.
  64. [64]
    Hidden risk of shadow data and shadow AI leads to higher breach ...
    35% of breaches this year involved data stored in unmanaged data sources—aka “shadow data.” This translated into data not being classified properly or at all ...
  65. [65]
    Shadow IT Definition: 2024 Statistics and Solutions - Josys
    Almost half of all cyberattacks are linked to shadow IT, with the average cost of addressing these breaches exceeding $4.2 million. · 83% of IT professionals ...
  66. [66]
    KnowBe4 Report Reveals Security Training Reduces Global ...
    May 13, 2025 · Security training reduced the global Phish-prone Percentage (PPP) by 86% after 12 months, with a 40% drop in 3 months. The baseline PPP was 33. ...
  67. [67]
    Security Awareness Training and Simulated Phishing Work to ...
    Oct 30, 2023 · KnowBe4 analysis confirms security awareness training and simulated phishing significantly reduce organizational cybersecurity risks and ...
  68. [68]
    Why Cyber Insurance Providers Now Require MFA Security?
    Sep 14, 2025 · Studies show MFA can block over 99% of automated account takeovers, making it one of the most effective defenses. Learn more about why MFA is ...
  69. [69]
    Why Stopping Account Takeovers Requires Defense-in-Depth
    Apr 30, 2025 · And 62% of them experienced successful account compromises. Somewhat more unsettling is that 65% of the compromised accounts had MFA enabled.
  70. [70]
    What is User Behavior Analytics? (UBA) - IBM
    User behavior analytics (UBA) uses data analytics and machine learning to track users and detect deviations that might signify security threats.What is user behavior... · UBA versus UEBA
  71. [71]
    What Is Behavioral Analytics? - CrowdStrike
    Jan 16, 2025 · Behavioral analytics focuses on user behavior within networks and applications, watching for unusual activity that may signify a security ...
  72. [72]
    2025 Security Awareness Training Stats and Trends - Keepnet Labs
    Jan 23, 2024 · Cyber security awareness training leads to a 70% reduction in security-related risks in 2023. This fact underscores the significant impact that ...
  73. [73]
    Does security awareness training result in fewer breaches?
    Mar 10, 2025 · The results speak for themselves. Research by KnowBe4 shows that organisations with effective SAT programmes are 8.3 times less likely to ...
  74. [74]
    Calculating the ROI on Security Awareness Training
    Aug 2, 2025 · This ROI formula shows that the return of investment is 137.6%, or about $138,000 each year. This equation clearly shows the value of helping ...<|control11|><|separator|>
  75. [75]
    Computer Torts
    Computer torts include damage to hardware/software, negligent use, privacy violations, and defamation by computer, which can damage reputation.
  76. [76]
    [PDF] The Reasonable Computer: Disrupting the Paradigm of Tort Liability
    Apr 4, 2018 · The article argues that if a computer is safer than a reasonable person, the supplier should be liable in negligence, not strict liability.
  77. [77]
    Why You Probably Don't Have Product Liability for the Software You ...
    Dec 11, 2018 · With a strong license agreement between the user and software developer, it seems that product liability is largely limited, according to Berger ...
  78. [78]
    Third Circuit: Absent Hacking, Violating Employer's Computer-Use ...
    Sep 5, 2025 · The company sued under the CFAA's provisions that prohibit employees from accessing a computer without authorization and exceeding the ...
  79. [79]
    Violations of an Employer's Computer Use Policies Cannot Support ...
    Oct 16, 2025 · The Court concluded that absence evidence of code-base hacking, the CFAA does not “countenance claims premised on a breach of workplace ...
  80. [80]
    Van Buren in Action: Third Circuit Rejects Application of the ...
    Aug 29, 2025 · The ruling thus shields workplace computer-use policy violations by current employees, such as password sharing or improper data use, from CFAA ...
  81. [81]
    Courts Redefining Software As Product Generates New Risks
    Aug 1, 2025 · This distinction has historically shielded software developers and digital platform providers from the strict liability claims that have ...
  82. [82]
    [PDF] NEGLIGENCE AND AI'S HUMAN USERS - Boston University
    It is therefore important to understand how tort law views users of AI, not just its creators. Negligence asks whether a person has violated her duty of.<|control11|><|separator|>
  83. [83]
    [PDF] Fake Cases, Real Consequences: Misuse of ChatGPT Leads to ...
    On June 22, 2023, Judge P. Kevin Castel of the United. States District Court for the Southern District of New York4 in Mata v. Avianca, Inc., imposed ...
  84. [84]
    Federal Court Turns Up the Heat on Attorneys Using ChatGPT for ...
    Aug 13, 2025 · Supp. 3d 443, 448 (S.D.N.Y. 2023), as the leading case on the consequences of misuse of generative artificial intelligence in legal pleadings.
  85. [85]
    Cybersecurity Negligence and Personal Liability: What CISOs and ...
    Cybersecurity Negligence and Personal Liability: What CISOs and Board Members Need to Know ... Personal liability for cybersecurity isn't just a U.S./EU concern.
  86. [86]
    Personal liability: A new trend in cybersecurity compliance? - CIO
    Oct 29, 2024 · ... negligent in overseeing functions related to cybersecurity can be held personally liable. ... or willful negligence. These are also, of course, EU ...
  87. [87]
    Liability for software insecurity: Striking the right balance - IAPP
    May 23, 2023 · While product liability is absolute, it may only apply when insecure software is viewed as defective. An alternate theory of liability is based ...
  88. [88]
    [PDF] ARTICLE - Boston University
    GDPR, the organizations obtaining consent must be able to produce proof of it. ... One way to address consent fatigue might be to increase the scope of consent.
  89. [89]
    The Role of Consent Form Design Under GDPR: A Survey Experiment
    Feb 17, 2024 · This study empirically investigates how to ensure informed consent and protection of personal data without compromising the needs of academic ...
  90. [90]
    California Consumer Privacy Act (CCPA)
    Mar 13, 2024 · Businesses cannot sell or share your personal information after they receive your opt-out request unless you later authorize them to do so again ...CCPA Opt-Out Icon · CCPA Regulations · CCPA Enforcement CaseMissing: abandonment | Show results with:abandonment
  91. [91]
    The impact of the EU General data protection regulation on product ...
    Oct 30, 2023 · This study provides evidence on the likely impacts of the GDPR on innovation. We employ a conditional difference-in-differences research design and estimate ...
  92. [92]
    [PDF] The Impact of EU Data Regulations on Innovation, Competitiveness ...
    service providers. Empirical evidence shows that the. GDPR has reduced the supply of digital services in the. EU, compared to other regions and to the pre-GDPR.
  93. [93]
    More Evidence Emerges That the GDPR Has Inflicted Lasting ...
    May 11, 2022 · A new study provides further evidence that the GDPR has inflicted a lasting blow to the European digital economy.
  94. [94]
    Understanding Why Users Abandon Apps - Twinr
    Jun 12, 2025 · Pew Research discovered 72% of users will uninstall apps because of privacy issues, one of the primary rate app churn reasons. Asking to access ...Missing: regulations | Show results with:regulations
  95. [95]
    Dangerous EARN IT Bill Advances Out of Committee, but Several ...
    May 10, 2023 · If enacted, EARN IT will put massive legal pressure on internet companies both large and small to stop using true end-to-end encryption and ...Missing: threats | Show results with:threats
  96. [96]
    The EARN IT Act Is a Sneak Attack on Encryption - WIRED
    Mar 5, 2020 · The bill threatens to erode established protections against holding tech companies responsible for what people do and say on their platforms.
  97. [97]
    Warrant-Proof Encryption and Lawful Access - FBI
    "Lawful access" refers to law enforcement's ability to obtain evidence and threat information from digital service providers and device manufacturers.Missing: proposals 2020s
  98. [98]
    Lawful Access to Encrypted Data Act 116th Congress (2019-2020)
    This bill requires certain technology companies to ensure that they can decode encrypted information on their services and products.Missing: proposals 2020s
  99. [99]
    Fancy New Terms, Same Old Backdoors: The Encryption Debate in ...
    Dec 26, 2019 · The only innovation in 2019 has been rhetorical—anti-encryption authorities are determined not to call a backdoor a backdoor. Instead, we saw a ...
  100. [100]
    There is No Middle Ground on Encryption
    May 2, 2018 · Opponents of encryption imagine that there is a “middle ground” approach that allows for strong encryption but with “exceptional access” for law enforcement.
  101. [101]
    Americans' Privacy Strategies Post-Snowden - Pew Research Center
    Mar 16, 2015 · Nearly two years after Snowden's revelations, 87% of Americans say they have heard about U.S. surveillance programs.Missing: empirical | Show results with:empirical
  102. [102]
    Encryption Backdoors: The Security Practitioners' View - SecurityWeek
    Jun 19, 2025 · When government demands something, 'No' is not an acceptable response. Government simply waits, rephrases the demand, and then demands again.
  103. [103]
    Encryption Under Threat: The UK's Backdoor Mandate and Its ...
    May 19, 2025 · Weakening encryption not only undermines personal privacy but also jeopardizes national security and global cybersecurity standards.
  104. [104]
    The FTC's New Report Reaffirms Big Tech's Personal Data Overreach
    Oct 3, 2024 · The report confirms data privacy violations that consumer advocates have been sounding the alarm on for years.
  105. [105]
    The New Rules of Data Privacy - Harvard Business Review
    Feb 25, 2022 · People will not hoard their data assets. Instead, they'll invest them in companies that provide them with a return in the form of more and ...Missing: debates | Show results with:debates
  106. [106]
    [PDF] The Crypto Wars - Electronic Frontier Foundation
    Aug 19, 2025 · It is technologically impossible to give governments an encryption backdoor without weakening privacy and security for everyone. But ...Missing: arguments | Show results with:arguments
  107. [107]
    A CDO Call To Action: Stop Hoarding Data—Save The Planet - Forbes
    Jul 20, 2023 · Most companies now have a serious data hoarding problem. It's estimated that anywhere from 50% to 90% of data collected and stored by organizations is dark ...<|separator|>
  108. [108]
    The state of privacy in post-Snowden America - Pew Research Center
    Sep 21, 2016 · The public generally believes it is acceptable for the government to monitor many others, including foreign citizens, foreign leaders and ...Missing: empirical | Show results with:empirical
  109. [109]
    The Promise and Pitfalls of AI-Assisted Code Generation
    Mar 21, 2025 · When ChatGPT was introduced in late 2022, it quickly transformed the coding landscape. Suddenly, developers had an accessible, ...
  110. [110]
    AI-Powered Low-Code Platform for Apps and Agents | OutSystems
    OutSystems AI-powered low-code provides everything you need to build, manage, and scale mission-critical apps and agents 10x faster with a single, enterprise- ...Full-stack development · Accelerated Development · OutSystems AI · Integration
  111. [111]
    What Gartner's Magic Quadrant really say about low-code vs no-code?
    Oct 13, 2025 · By 2025, 70 percent of new applications will be built using low-code or no-code platforms, up from less than 25 percent in 2020. The global low- ...
  112. [112]
    Hidden Risks of Shadow AI - Varonis
    Jun 30, 2025 · Shadow AI can result in non-compliance with regulations like GDPR, HIPAA, or CCPA if sensitive data is processed in unauthorized or unregulated ...Missing: 2020s | Show results with:2020s
  113. [113]
    Shadow AI is Everywhere: XM Cyber Finds 80% of Companies ...
    Aug 20, 2025 · The risks from Shadow AI run deeper. These tools can absorb sensitive data – including proprietary code, customer PII, financial models, and ...Missing: 2020s | Show results with:2020s
  114. [114]
    AI-Generated Code Statistics 2025: Can AI Replace Your ... - Netcorp
    Jul 30, 2025 · AI assistance leads to a 21% productivity boost in complex knowledge work, according to Microsoft-backed trials. AI agents improve ...
  115. [115]
  116. [116]
    Measuring the Impact of Early-2025 AI on Experienced ... - METR
    Jul 10, 2025 · We conduct a randomized controlled trial (RCT) to understand how early-2025 AI tools affect the productivity of experienced open-source developers.Missing: 2023-2025 | Show results with:2023-2025
  117. [117]
    Study reveals flaws and risks of AI-generated code - FutureCIO
    Aug 18, 2025 · The report found that Java is the riskiest language for AI code generation, with a security failure rate over 70%. Python, C#, and JavaScript ...
  118. [118]
    Survey: AI Tools are Increasing Amount of Bad Code Needing to be ...
    Jan 10, 2025 · Survey findings include respondents noted that AI tools are increasing the blast radius of the amount of bad code that needs to be debugged.
  119. [119]
    Microsoft Unveils Azure Virtual Desktop As New Name For WVD ...
    Jun 7, 2021 · Microsoft has renamed Windows Virtual Desktop (WVD) to Azure Virtual Desktop and announced more control via the Microsoft Endpoint Manager.
  120. [120]
    Microsoft Windows Virtual Desktop Azure DaaS Offering Now GA ...
    Oct 4, 2019 · It provides more flexibility in performance in that WVD can run any size virtual machine in Azure and allows multiple users on a single VM.
  121. [121]
    Microsoft Announces Windows 365 as Azure Virtual Desktop ...
    Starting in August 2021, users can access Windows 365 through a client application via Windows, Mac, iOS, Android or Linux devices and even through an HTML ...
  122. [122]
    Industry Trends in End-User Computing - Choice Solutions
    Mar 12, 2025 · Gartner forecasts that by 2025, 60% of enterprises will rely on RAS and virtualized workspaces to enhance operational agility and security, up ...Industry Trends In End-User... · The Rise Of Windows 365 And... · The Role Of Ai In End-User...<|control11|><|separator|>
  123. [123]
    XTIUM Visionary — 2025 Gartner® Magic Quadrant™ for DaaS
    New Highlights from the Report: Gartner forecasts that DaaS spending will grow from $4.3 billion in 2025 to $6.0 billion in 2029, with a compound annual growth ...
  124. [124]
    What is VDI (Virtual Desktop Infrastructure)? - AWS
    VDI brings centralized management to your virtual desktop environment. Administrators can patch, update, and change multiple virtual desktops at the same time.
  125. [125]
    Virtual Desktop Infrastructure (VDI): Types, Pros, Cons - Splashtop
    Oct 7, 2025 · The benefits of Persistent VDI include higher user satisfaction and productivity, as employees can store files and make system changes as ...
  126. [126]
    Building a Business Case for VDI: Key Considerations for IT Leaders
    Rating 4.9 (5,682) According to a report from Gartner, organizations can save up to 30% on desktop management costs with a VDI solution. You'll also reduce the need for costly ...
  127. [127]
    How to Reduce Operating Costs: 6 Efficient Strategies | V2 Cloud
    Rating 9.4/10 (235) Research shows that companies using virtual desktops can reduce hardware spending by up to 40%.<|separator|>
  128. [128]
    Reducing Latency in Virtual Desktops: 11 Fixes That Actually Work
    Jul 7, 2025 · Tired of laggy virtual desktops? Learn how to reduce latency with real-world fixes—from quick network tweaks to backend upgrades—and ...
  129. [129]
    Remote workers (from another country) facing high latency/slowness ...
    May 21, 2024 · The issue we're fighting is the high latency and slowness they deal with when using our RMM and PSA tools. We've tried a handful of solutions.Virtual Desktop users constantly reporting latency and disconnects ...Hardware bottlenecks for remote desktop usage latency? - RedditMore results from www.reddit.com
  130. [130]
    Top Microsoft Technology Trends to Watch in 2025 - CloudServus
    Jan 17, 2025 · Windows 365 Growth: Cloud PC adoption will expand, providing employees with secure, consistent access to work environments from anywhere.
  131. [131]
    The Impact of Cloud computing in 2025 - Cyber Defense Magazine
    Aug 12, 2025 · Cloud computing reduces infrastructure costs by up to 30% for some enterprises, according to industry reports. But more importantly, businesses ...Missing: end- | Show results with:end-
  132. [132]
    Windows 365 brings resilient, AI-driven cloud productivity to more ...
    Sep 17, 2025 · In April 2025 we introduced Windows 365 Link in seven markets, streamlining access to Cloud PCs via simple, secure, purpose-built devices.
  133. [133]
    Shadow IT Statistics: Key Facts to Learn in 2025 - Zluri
    Approximately 85% of businesses worldwide have encountered cyber incidents in the past two years, with 11% due to unauthorized shadow IT usage. 60% of ...
  134. [134]
    No. 1 Reason The 'Shadow IT' Trend Is Benefiting Careers But ...
    May 5, 2024 · One study found that 80% of company employees adopt “Shadow IT” for their convenience and productivity—they feel they can work more efficiently ...
  135. [135]
    ChatGPT is the number one offender in shadow IT, report finds
    May 22, 2024 · ChatGPT's meteoric rise has supplanted dozens of other apps to become top dog in shadow IT rankings, according to a report from spending management firm ...Missing: incidents | Show results with:incidents
  136. [136]
    Shadow IT surge threatens corporate data: report - CFO Dive
    May 21, 2024 · Dive Brief: The volume of corporate data placed into artificial intelligence tools by workers skyrocketed 485% from March 2023 to March 2024, ...Missing: prevalence | Show results with:prevalence
  137. [137]
  138. [138]
    Who Is Responsible When AI Breaks the Law?‌‌ | Yale Insights
    Dec 11, 2024 · If an AI is a black box, who is liable for its actions? The owner of the platform? The end user? Its original creator?
  139. [139]
    Shadow AI doubles every 18 months, creating blind spots SOCs ...
    Sep 16, 2025 · Deepfake incidents surged 3,000% in 2023. Contact centers experienced a 700% increase in voice-based attacks. By 2024, convincing voice ...
  140. [140]
    Regional and International AI Regulations and Laws in 2025
    Jul 5, 2024 · In this article, we'll move from specific legal issues of AI to the general regulatory landscape, focusing on the EU and the US as global trendsetters for ...