Kernel

A kernel is the core or central part of something. The term is used in various fields with specific meanings. In biology, it refers to the inner part of seeds, nuts, or grains. In mathematics, it denotes the kernel of a homomorphism or of an integral operator. In computing, it can mean the operating system kernel or kernel methods in machine learning. In physical sciences, it describes certain functions or structures in physics and engineering. Additionally, "Kernel" is used as a name for companies and software projects.

Biology

Seed and nut kernels

In botany, a kernel refers to the inner core of a seed or nut, typically consisting of the embryo or the endosperm-containing portion that is protected by an outer shell, husk, or pericarp. This structure serves as the primary nutritive tissue for the developing embryo, enabling its growth until germination. The term originates from the Old English word "cyrnel," a diminutive of "corn," meaning a small seed or grain, which over time evolved to signify the essential core or heart of something. Common examples of kernels include those found in tree nuts and certain fruit stones, such as the almond kernel, which is the edible seed inside the almond's hard shell; the walnut kernel, comprising the wrinkled, oily seed within the walnut's shell; and the peach stone kernel, the inner seed of the peach stone that contains the embryo. These kernels are distinct from the surrounding protective layers, which vary in thickness and composition depending on the species.

Botanically, kernels are rich in essential nutrients, primarily composed of carbohydrates for energy, oils or fats for storage, and proteins for enzymatic and growth functions. For instance, almond kernels contain approximately 50-60% oils, mainly monounsaturated fats, alongside 20% proteins and 20% carbohydrates including fiber. Walnut kernels are similarly lipid-dominant, with about 60-70% unsaturated fatty acids, complemented by proteins and minimal carbohydrates. During germination, the kernel's composition supports the embryo and provides the reserves needed for sprouting, where enzymes break down stored starches and proteins to fuel the emerging seedling until photosynthesis begins.

Humans have utilized seed and nut kernels for millennia, primarily for culinary purposes such as direct consumption, roasting, or processing into oils and butters. Almond kernels, for example, are pressed to extract almond oil used in cooking and cosmetics, while walnut kernels feature in baked goods and salads. Nutritionally, these kernels are valued for their high content of healthy fats, such as omega-3s in walnuts, which support cardiovascular health, along with vitamins (e.g., vitamin E) and minerals like magnesium. Historical evidence indicates that nut kernels were gathered and cultivated since prehistoric times, with archaeological finds of almond and walnut remains dating back over 10,000 years in the Near East, marking early human reliance on them as a food source.

Grain kernels

A grain kernel constitutes the entire fruit (caryopsis) of cereal plants in the grass family Poaceae, serving as the primary harvestable unit and storage organ for nutrients; in maize (Zea mays L.), it exemplifies this structure as a caryopsis where the pericarp fuses with the seed coat. These kernels form the basis of staple crops like wheat, rice, and maize, providing carbohydrates, proteins, and oils essential for human and animal nutrition. The internal anatomy of a maize kernel includes three main components: the pericarp, a tough outer protective layer derived from the ovary wall that comprises about 5-6% of the kernel's weight; the endosperm, the largest portion at 75-85%, rich in starch and proteins for energy during germination; and the germ, or embryo, accounting for 10-12% and containing oils, vitamins, and the genetic material for new growth. This compartmentalized structure enables efficient processing while preserving viability.

Varieties of kernels vary by endosperm type and intended use, including dent corn with soft, starchy endosperm that indents upon drying, ideal for animal feed and industrial applications; flint corn with hard, vitreous endosperm for storage and grinding; and sweet corn, featuring high sugar content in the endosperm for fresh consumption before starch conversion. Maize kernels trace their agricultural origins to domestication from teosinte in Mesoamerica around 7000 BCE, where early farmers in present-day Mexico selectively bred wild grasses for larger, more nutritious seeds, marking a pivotal shift in human agriculture. Following the Columbian Exchange after 1492 CE, maize kernels spread rapidly from the Americas to Europe, Africa, and Asia, adapting to diverse climates and becoming a global staple crop that supported population growth and dietary diversification.

In modern production, maize kernels are harvested mechanically when moisture content reaches 15-25% to minimize damage, followed by drying to 13-15% for storage and transport. Processing methods include dry milling, which separates the kernel into grits, meal, and flour for food products, and wet milling, which isolates starch, oil, and protein for industrial uses; popping transforms select kernels under heat and pressure into expanded snacks like popcorn. Economically, maize underscores global food security, with the United States leading as the top producer at approximately 378 million metric tons in the 2024/2025 marketing year. Maize kernels serve multifaceted uses, primarily as human food in forms such as cornmeal, tortillas from nixtamalized maize, and breakfast cereals; as animal feed, accounting for about 40% of U.S. corn use due to its high energy content; and increasingly for biofuel, with ethanol production from kernels surging from modest levels in 2000 to over 50 billion liters annually by the 2010s, driven by policy incentives like the Renewable Fuel Standard. This versatility highlights the kernel's role in balancing food, feed, and fuel demands.

Mathematics

Kernels in algebra

In algebra, the kernel of a group homomorphism f: G \to H between groups G and H is defined as the set \ker(f) = \{g \in G \mid f(g) = e_H\}, where e_H is the identity element in H. This set consists of all elements in the domain that map to the identity, effectively capturing the "degeneracy" or loss of information in the mapping. The kernel \ker(f) forms a normal subgroup of G, ensuring compatibility with the group structure under conjugation. This normality is crucial, as it allows the construction of the quotient group G / \ker(f). By the first isomorphism theorem, G / \ker(f) \cong \operatorname{Im}(f), where \operatorname{Im}(f) is the image of f, linking the kernel directly to the structure of the homomorphism's range.

This concept generalizes to modules over a ring, where for a module homomorphism f: M \to N, the kernel \ker(f) = \{m \in M \mid f(m) = 0\} is a submodule of M. In the specific case of vector spaces, for a linear transformation T: V \to W between finite-dimensional vector spaces over a field, the kernel \ker(T) = \{v \in V \mid T(v) = 0\} is a subspace of V. The rank-nullity theorem states that \dim(\ker(T)) + \dim(\operatorname{Im}(T)) = \dim(V), quantifying the relationship between the kernel's dimension (nullity) and the image's dimension (rank). In category theory, the kernel of a morphism f: A \to B in a category with a zero object is the equalizer of f and the zero morphism from A to B, providing a categorical construction for the preimage of the zero element. The notion of the kernel, particularly in the context of ideals as kernels of ring homomorphisms, was introduced by Emmy Noether in her foundational work on ring theory during the 1920s.
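
The rank-nullity theorem is easy to check numerically. The sketch below is a minimal illustration (assuming NumPy; the matrix A is a hypothetical example, not drawn from any source above): it computes a basis for the kernel of the linear map v \mapsto Av from the singular value decomposition and verifies that rank plus nullity equals the dimension of the domain.

    import numpy as np

    # Hypothetical 3x4 matrix of rank 2 (third row = first row + second row),
    # viewed as a linear map T: R^4 -> R^3.
    A = np.array([[1., 2., 0., 1.],
                  [0., 1., 1., 0.],
                  [1., 3., 1., 1.]])

    # Kernel basis from the SVD: right-singular vectors whose singular values
    # are numerically zero span the null space of A.
    U, s, Vt = np.linalg.svd(A)
    tol = max(A.shape) * np.finfo(float).eps * s.max()
    rank = int((s > tol).sum())
    kernel_basis = Vt[rank:].T           # columns span ker(T)

    nullity = kernel_basis.shape[1]
    assert rank + nullity == A.shape[1]  # rank-nullity: 2 + 2 = 4
    print(np.allclose(A @ kernel_basis, 0))  # True: basis vectors map to zero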

Kernels in analysis

In mathematical analysis, kernels appear prominently in the theory of integral operators and equations, where a kernel K(x, y) defines an operator that maps a function f to another function g via the integral transform g(x) = \int K(x, y) f(y) \, dy, with the integral taken over a suitable domain such as an interval or region in \mathbb{R}^n. This construction underlies Fredholm integral equations of the first kind, g(x) = \int K(x, y) f(y) \, dy, which seek to recover f from known g and K.

Kernels are classified by their structure and properties, facilitating solvability and analysis. A degenerate kernel admits a finite-rank representation K(x, y) = \sum_{i=1}^m \phi_i(x) \psi_i(y), where \{\phi_i\} and \{\psi_i\} are finite sets of functions; this separability reduces the integral equation to a finite-dimensional algebraic system, enabling exact solutions in closed form. In contrast, a Hilbert-Schmidt kernel satisfies the square-integrability condition \iint |K(x, y)|^2 \, dx \, dy < \infty over the domain, ensuring the associated integral operator is compact on L^2 spaces and possesses a discrete spectrum with eigenvalues accumulating only at zero.

Representative examples illustrate the role of kernels in specific contexts. The Dirac delta distribution \delta(x - y) acts as the trivial kernel for the identity operator, yielding \int \delta(x - y) f(y) \, dy = f(x) in the distributional sense, which reproduces the input function without alteration. Another canonical example is the Poisson kernel for the unit disk in the complex plane, given by P_r(\theta) = \frac{1 - r^2}{1 - 2r \cos \theta + r^2}, \quad 0 \leq r < 1, \, -\pi \leq \theta \leq \pi, which solves the Dirichlet problem for harmonic functions by expressing the value at an interior point as a boundary integral: if u is harmonic in the disk with boundary values f(e^{i\phi}), then u(re^{i\theta}) = \frac{1}{2\pi} \int_{-\pi}^{\pi} P_r(\theta - \phi) f(e^{i\phi}) \, d\phi. In Fourier analysis, convolution kernels enable smoothing operations; for instance, convolving a function f with a low-pass kernel k, defined as (k * f)(x) = \int k(x - y) f(y) \, dy, attenuates high-frequency components, and the convolution theorem states that the Fourier transform of the result is the pointwise product of the individual transforms, \widehat{k * f} = \hat{k} \cdot \hat{f}, facilitating efficient computation and noise reduction.

Key properties of kernels govern the behavior of the associated operators. A kernel is Hermitian if K(x, y) = \overline{K(y, x)} (or real-symmetric for real-valued cases), rendering the integral operator self-adjoint on L^2, with real eigenvalues and orthogonal eigenfunctions; this symmetry simplifies spectral analysis and ensures positive-definiteness for certain applications like reproducing kernel Hilbert spaces. In Fredholm theory, the eigenvalues of compact integral operators (e.g., those with continuous or Hilbert-Schmidt kernels) form a discrete sequence \{\lambda_n\} converging to zero, with finite multiplicity except possibly at zero, and the resolvent operator admits a Neumann series expansion for |\lambda| < 1/|\lambda_1|, where \lambda_1 is the eigenvalue of largest modulus, so that |\lambda_1| equals the spectral radius.

The historical development traces to Vito Volterra's 1896 papers, which introduced integral equations of the first kind arising from inverting definite integrals, laying groundwork for equations with variable limits.
Ivar Fredholm advanced the field in 1903 by establishing existence and uniqueness for equations with fixed limits and continuous kernels, introducing the resolvent kernel and spectral theory for the second-kind form. A paradigmatic equation is the Fredholm integral equation of the second kind, f(x) = g(x) + \lambda \int_a^b K(x, y) f(y) \, dy, whose solution exists uniquely for \lambda not an eigenvalue, via iteration or the Fredholm determinant \det(I - \lambda K) \neq 0.
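
A minimal numerical sketch of the second-kind equation (assuming NumPy; the choices K(x, y) = xy, g(x) = x, and \lambda = 1 are hypothetical, picked so the exact solution f(x) = 3x/2 is known in closed form) uses the Nyström method: replace the integral by a quadrature rule and solve the resulting linear system.

    import numpy as np

    # Solve f(x) = g(x) + lam * \int_0^1 K(x,y) f(y) dy by discretizing the
    # integral with trapezoidal weights and solving (I - lam * K * W) f = g.
    lam = 1.0
    g = lambda x: x
    K = lambda x, y: x * y          # degenerate (rank-1) kernel; exact f(x) = 1.5 x

    n = 201
    x = np.linspace(0.0, 1.0, n)
    w = np.full(n, x[1] - x[0])
    w[0] *= 0.5
    w[-1] *= 0.5                    # trapezoidal quadrature weights

    Kmat = K(x[:, None], x[None, :])           # matrix of K(x_i, y_j)
    A = np.eye(n) - lam * Kmat * w[None, :]    # discretized (I - lam K)
    f = np.linalg.solve(A, g(x))

    print(np.abs(f - 1.5 * x).max())  # small discretization error, O(h^2)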

Kernels in statistics

In statistics, kernels refer to specialized weighting functions employed in nonparametric methods for estimating probability density functions and performing smoothing operations in probabilistic models. A kernel K is typically a non-negative, symmetric function satisfying the normalization condition \int_{-\infty}^{\infty} K(u) \, du = 1, ensuring it acts as a valid density itself. This setup allows kernels to assign weights to data points based on their proximity to an evaluation point, facilitating data-driven approximations without assuming a parametric form for the underlying distribution.

The primary application of kernels in statistics is kernel density estimation (KDE), a technique to construct an empirical estimate of an unknown density f from an independent and identically distributed sample X_1, \dots, X_n. The KDE is defined as \hat{f}(x) = \frac{1}{n h} \sum_{i=1}^n K\left( \frac{x - X_i}{h} \right), where h > 0 is a smoothing parameter known as the bandwidth, controlling the degree of local averaging. KDE was first proposed by Rosenblatt in 1956 as a method for nonparametric density estimation, with Parzen providing key refinements in 1962, including asymptotic consistency results under suitable conditions on K and h. Beyond density estimation, kernels extend to smoothing in probabilistic models, such as estimating conditional densities or serving as building blocks for more complex estimators.

Commonly used kernels balance computational simplicity, smoothness, and efficiency in minimizing estimation error. The Gaussian kernel is K(u) = \frac{1}{\sqrt{2\pi}} \exp\left( -\frac{u^2}{2} \right), offering infinite support and desirable tail behavior for densities. The Epanechnikov kernel, K(u) = \frac{3}{4} (1 - u^2) for |u| \leq 1 and 0 otherwise, is compactly supported and asymptotically optimal in terms of minimizing the mean integrated squared error (MISE) among kernels of order 2. The uniform kernel, K(u) = \frac{1}{2} for |u| \leq 1 and 0 otherwise, provides a simple rectangular weighting but can lead to blockier estimates compared to smoother alternatives.

Kernels find applications in nonparametric regression, where kernel smoothing underlies methods like the Nadaraya-Watson estimator to smooth response variables against predictors by weighting observations via kernel functions centered at the target point. Bandwidth selection is essential for performance, as it governs the resolution of the estimate; common approaches include cross-validation, which minimizes an empirical estimate of the integrated squared error by leaving out each observation in turn, and rule-of-thumb heuristics tailored to the kernel choice.

Key properties of kernel-based estimators revolve around the bias-variance tradeoff: the bias term is typically O(h^2) for second-order kernels assuming sufficient smoothness of f, while the variance is O(1/(n h)), leading to pointwise mean squared error (MSE) of order O(h^4 + 1/(n h)). Optimal bandwidths scale as O(n^{-1/5}) to minimize MSE, achieving convergence rates faster than histogram-based methods under minimal assumptions, provided h \to 0 and n h \to \infty as n \to \infty. These characteristics ensure consistent estimation and highlight the method's robustness for nonparametric inference in statistics.
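
The estimator defined above is short to implement. The following sketch (illustrative, assuming NumPy; the two-component normal sample and the use of Silverman's rule-of-thumb bandwidth h = 1.06 \hat{\sigma} n^{-1/5} are assumptions, not prescriptions from any source above) builds a Gaussian KDE on a grid and confirms that it integrates to approximately one.

    import numpy as np

    rng = np.random.default_rng(0)
    # synthetic bimodal sample (hypothetical data for illustration)
    sample = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(1, 1.0, 500)])

    def gaussian_kernel(u):
        return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

    def kde(x, data, h):
        # f_hat(x) = (1/(n h)) * sum_i K((x - X_i) / h)
        u = (x[:, None] - data[None, :]) / h
        return gaussian_kernel(u).sum(axis=1) / (len(data) * h)

    n = len(sample)
    h = 1.06 * sample.std(ddof=1) * n ** (-1 / 5)  # Silverman's rule of thumb
    grid = np.linspace(-5, 5, 400)
    density = kde(grid, sample, h)
    print(np.trapz(density, grid))  # ~1.0: a valid density estimate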

Computing

Operating system kernel

In computing, the operating system kernel is the central component of an operating system that acts as the primary interface between applications and hardware, managing essential resources such as the CPU, memory, and I/O devices. It operates in a privileged mode known as kernel mode, distinct from user mode where applications run, ensuring that only authorized code can access hardware directly to prevent crashes or security breaches. This separation enforces isolation through mechanisms like hardware-enforced privilege levels, allowing the kernel to execute sensitive operations while restricting user programs to safer, sandboxed environments.

The kernel's core functions include process management, which involves scheduling tasks on the CPU, creating and terminating processes, and facilitating inter-process communication (IPC) via mechanisms like pipes or message queues; memory management, encompassing allocation and virtual memory implementation through paging and segmentation to provide processes with isolated address spaces; and device management via drivers that abstract hardware interactions, enabling uniform access to peripherals such as disks or networks. These functions ensure efficient resource utilization and system stability, with the kernel handling interrupts and context switches to multitask effectively. For instance, in process scheduling, the kernel uses algorithms like priority-based scheduling to allocate time slices, optimizing throughput while minimizing latency.

Historically, the concept of a kernel emerged with Multics, a pioneering time-sharing system developed jointly by MIT, General Electric, and Bell Labs starting in 1965, which introduced time-sharing and protected memory but was complex and resource-intensive. Influenced by Multics, Ken Thompson and Dennis Ritchie at Bell Labs created the first Unix kernel in 1969-1970, initially in assembly for the PDP-7, emphasizing simplicity, portability, and a hierarchical file system; this evolved into the C-implemented kernel by 1975, laying the foundation for modern systems. The Linux kernel, initiated by Linus Torvalds in 1991 as a free, monolithic alternative inspired by MINIX and Unix, rapidly grew through community contributions, reaching version 1.0 in 1994 and powering diverse platforms from servers to embedded devices.

Kernels are classified into types based on architecture: monolithic kernels, where all core services like file systems and drivers run in a single address space for high performance but with reduced modularity (e.g., Linux and traditional Unix); microkernels, which minimize the kernel to basic functions like IPC and thread management, running other services as user-space processes for better reliability and security (e.g., MINIX by Andrew Tanenbaum, introduced in 1987 to teach OS principles); and hybrid kernels, blending monolithic efficiency with microkernel modularity by integrating key components into the kernel while allowing some user-space extensions (e.g., the Windows NT kernel, designed in the early 1990s for robustness across hardware). Monolithic designs excel in speed due to direct function calls, while microkernels enhance fault isolation, as a driver crash affects only its process, not the entire system. Hybrids, like Windows NT, incorporate a hardware abstraction layer for portability.

Security in kernels relies on ring protection, a CPU feature dividing privilege levels into concentric rings, typically Ring 0 for the kernel (full access) and Ring 3 for applications (limited access), preventing unauthorized escalation via mechanisms like the x86 architecture's segment descriptors. The system call (syscall) interface provides a controlled gateway, where programs request kernel services through traps that switch modes, validating inputs to avoid direct hardware manipulation; for example, the read() syscall fetches data via a vetted buffer.
However, vulnerabilities persist, such as buffer overflows where excessive input overwrites adjacent memory, potentially allowing privilege escalation or arbitrary code execution; notable exploits include Linux's Dirty COW (CVE-2016-5195), which evaded protections. These issues underscore the need for rigorous auditing, as kernel bugs can compromise the entire system.

As of November 2025, modern kernel developments emphasize safety and scalability, with the Linux kernel in its 6.x series (e.g., 6.17 released in September 2025, with 6.18 in release candidate stage as of mid-November) incorporating Rust language support for new modules to mitigate memory safety issues prevalent in C code. Rust's ownership model prevents common errors like null pointer dereferences, with initial drivers (e.g., for NVMe and GPIO) merged since kernel 6.1 in 2022; by 2025, additional abstractions for core areas have expanded its footprint, including expanded support in 6.12 (designated LTS in December 2024) and further hardening in 6.17, aiming for broader adoption without destabilizing the C base. This hybrid approach, debated in kernel mailing lists, balances innovation with compatibility, reducing vulnerability classes like use-after-free by up to 70% in new components per early analyses.
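
To make the syscall gateway concrete, the sketch below (illustrative only; it assumes a Unix-like system where /etc/hostname exists) uses Python's os module, whose functions are thin wrappers over the kernel's open(2), read(2), and close(2) system calls; running it under strace shows each call trapping from user mode into kernel mode.

    import os

    # Each call below is a controlled entry point into the kernel:
    fd = os.open("/etc/hostname", os.O_RDONLY)  # open(2): kernel checks path and permissions
    data = os.read(fd, 256)                     # read(2): kernel copies bytes into a user buffer
    os.close(fd)                                # close(2): kernel releases the descriptor
    print(data.decode().strip())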

Kernel methods in machine learning

Kernel methods in machine learning are a class of algorithms that enable the handling of nonlinear relationships by implicitly mapping inputs into high-dimensional feature spaces through kernel functions. A kernel function K(\mathbf{x}, \mathbf{y}) computes the inner product between vectors \phi(\mathbf{x}) and \phi(\mathbf{y}) in this space, defined as K(\mathbf{x}, \mathbf{y}) = \langle \phi(\mathbf{x}), \phi(\mathbf{y}) \rangle, without explicitly constructing the mapping \phi, which could be computationally prohibitive for high dimensions. This approach leverages linear algorithms in the transformed space to achieve nonlinear decision boundaries in the original input space.

The kernel trick is the core technique that facilitates this implicit mapping, substituting direct dot products with kernel evaluations during optimization, thereby avoiding the explicit computation of \phi. For instance, the radial basis function (RBF) kernel, K(\mathbf{x}, \mathbf{y}) = \exp\left( -\frac{\|\mathbf{x} - \mathbf{y}\|^2}{2\sigma^2} \right), corresponds to an infinite-dimensional feature space and is widely used for its flexibility in capturing complex patterns. Other common kernels include the polynomial kernel, K(\mathbf{x}, \mathbf{y}) = (\mathbf{x} \cdot \mathbf{y} + c)^p, which extends linear models to higher-degree polynomials, and the sigmoid kernel, K(\mathbf{x}, \mathbf{y}) = \tanh(\kappa \mathbf{x} \cdot \mathbf{y} + c), inspired by neural networks. These kernels must be positive semi-definite to ensure the feature space is well-defined, as guaranteed by Mercer's theorem, which states that a continuous symmetric positive semi-definite kernel can be expressed as K(\mathbf{x}, \mathbf{y}) = \int \phi(\mathbf{x}) \phi(\mathbf{y}) \, d\mu for some measure \mu.

In support vector machines (SVMs), the kernel trick transforms the primal optimization problem into its dual form, which depends only on inner products via the kernel matrix. The objective is to maximize L_D = \sum_{i=1}^n \alpha_i - \frac{1}{2} \sum_{i,j=1}^n \alpha_i \alpha_j y_i y_j K(\mathbf{x}_i, \mathbf{x}_j), subject to 0 \leq \alpha_i \leq C and \sum_{i=1}^n \alpha_i y_i = 0, where \alpha_i are Lagrange multipliers, y_i are labels, and C is the regularization parameter; the decision function then becomes f(\mathbf{x}) = \sum_{i \in SV} \alpha_i y_i K(\mathbf{x}_i, \mathbf{x}) + b, with support vectors (SV) identified by nonzero \alpha_i. This formulation enables SVMs for both classification and support vector regression (SVR) by handling nonlinear separability efficiently.

Kernel methods extend to other applications, such as Gaussian processes (GPs), where the kernel defines the covariance function of a distribution over functions, allowing probabilistic predictions for regression and classification tasks. In kernel principal component analysis (kernel PCA), the kernel trick nonlinearizes standard PCA by eigendecomposing the kernel matrix to extract principal components in the feature space, useful for dimensionality reduction and data visualization. The origins trace to the 1992 introduction of kernel-based SVMs by Boser, Guyon, and Vapnik, building on earlier potential function methods, with widespread adoption accelerating in the 2000s through libraries like scikit-learn, which integrated kernel SVMs and GPs for practical use.
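
A small numerical check makes the kernel trick tangible. In this sketch (assuming NumPy; the feature map phi is the standard explicit expansion for the degree-2 polynomial kernel on \mathbb{R}^2, written out here purely for illustration), the kernel evaluated in the two-dimensional input space matches an ordinary dot product in the six-dimensional feature space, which is exactly what lets algorithms use K without ever constructing \phi.

    import numpy as np

    c = 1.0

    def poly_kernel(x, y):
        # degree-2 polynomial kernel K(x, y) = (x . y + c)^2
        return (x @ y + c) ** 2

    def phi(x):
        # explicit feature map whose dot products reproduce poly_kernel
        x1, x2 = x
        return np.array([x1**2, x2**2,
                         np.sqrt(2) * x1 * x2,
                         np.sqrt(2 * c) * x1,
                         np.sqrt(2 * c) * x2,
                         c])

    rng = np.random.default_rng(1)
    x, y = rng.normal(size=2), rng.normal(size=2)
    print(poly_kernel(x, y))  # kernel evaluation in input space
    print(phi(x) @ phi(y))    # identical value via the explicit map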

Physical sciences

Kernels in physics

In physics, a kernel often refers to a Green's function or an interaction term that facilitates the solution of linear differential equations describing physical systems, such as potentials or propagators. Green's functions, introduced by George Green in 1828 to solve for electrostatic potentials, serve as integral kernels that transform source terms into observable fields. For instance, in electrostatics, the kernel corresponds to the fundamental solution of Laplace's equation in free space, given by G(\mathbf{r}, \mathbf{r}') = -\frac{1}{4\pi |\mathbf{r} - \mathbf{r}'|}, which yields the inverse-distance potential 1/r for point charges.

A prominent example of such kernels is the Yukawa potential, proposed by Hideki Yukawa in 1935 to model short-range nuclear forces between protons and neutrons. This interaction kernel takes the form V(r) = -\frac{g^2}{r} e^{-\mu r}, where g is a coupling constant and \mu sets the range, decaying exponentially unlike the long-range Coulomb potential; it arises as the static limit of a massive boson propagator in quantum field theory. Kernels like this enable the computation of binding energies in nuclear models, such as the deuteron, by integrating over wave functions.

In quantum mechanics, kernels appear in the density matrix formulation, where the off-diagonal elements \rho(\mathbf{r}, \mathbf{r}') = \langle \mathbf{r}' | \hat{\rho} | \mathbf{r} \rangle act as a position-space kernel representing mixed states and expectation values via traces. Similarly, in scattering theory, the kernel of the Lippmann-Schwinger equation or the S-matrix encodes transition amplitudes between asymptotic states, with the T-matrix serving as an interaction kernel that relates incoming and outgoing waves through iterative integrals.

Applications of these kernels are central to solving boundary value problems, such as the Poisson equation \nabla^2 \phi = -\rho / \epsilon_0 for electrostatic potential \phi, whose general solution is the integral \phi(\mathbf{r}) = \int G(\mathbf{r}, \mathbf{r}') \frac{\rho(\mathbf{r}')}{\epsilon_0} \, d^3\mathbf{r}', where G is the appropriate Green's function satisfying the boundary conditions. In relativistic contexts, retarded and advanced kernels address causality in wave equations, like the Klein-Gordon equation (\square + m^2) \phi = J; the retarded Green's function G^R(x, x') = -\frac{\theta(x^0 - x'^0) \delta((x - x')^2)}{2\pi} for massless fields propagates signals forward in time, while the advanced version does so backward, ensuring solutions respect light-cone structure in spacetime.

Historically, the kernel formalism evolved significantly in quantum field theory after the 1940s, with renormalization techniques by Richard Feynman, Julian Schwinger, and Sin-Itiro Tomonaga incorporating Green's functions as propagators for Feynman diagrams, linking perturbative expansions to measurable scattering cross-sections. This post-war development solidified kernels as indispensable tools for handling infinities and interactions in relativistic quantum systems.
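
As a short consistency check (a standard textbook derivation, stated here for illustration rather than taken from any one source above), the Yukawa form is the Green's function of the screened Poisson equation, and the Coulomb-type kernel is recovered in the massless limit:

(\nabla^2 - \mu^2) \, G(\mathbf{r}) = \delta^3(\mathbf{r}) \quad \Longrightarrow \quad G(\mathbf{r}) = -\frac{e^{-\mu r}}{4\pi r},

so the potential V(r) = -\frac{g^2}{r} e^{-\mu r} reduces to the long-range form -g^2 / r as \mu \to 0, recovering the 1/r behavior of the electrostatic kernel.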

Kernels in engineering

In engineering, particularly in signal processing and image analysis, a convolution kernel refers to a small matrix or array of coefficients that performs filtering operations on input data by sliding over it and computing weighted sums of neighboring elements. This operation modifies the input signal or image to enhance specific features, such as detecting edges in images where a kernel like the Sobel operator highlights intensity gradients.

In digital signal processing, kernels are integral to finite impulse response (FIR) and infinite impulse response (IIR) filters, where the kernel coefficients define the filter's impulse response. For FIR filters, the output is computed via discrete convolution, given by the equation y[n] = \sum_{k=0}^{M-1} h[k] x[n-k], where h[k] are the kernel coefficients, x[n] is the input signal, and y[n] is the filtered output; this directly implements linear filtering without feedback. IIR filters, while recursive, can approximate kernel-based convolutions for efficiency in real-time applications like audio processing.

Practical applications of kernels abound in image and audio engineering. For image sharpening, the Laplacian kernel, such as \begin{bmatrix} 0 & -1 & 0 \\ -1 & 5 & -1 \\ 0 & -1 & 0 \end{bmatrix}, amplifies high-frequency details by emphasizing local intensity changes. Conversely, Gaussian kernels enable blurring for noise reduction, with the kernel derived from a 2D Gaussian function to smooth edges softly. In audio engineering, kernel-based filters facilitate equalization by adjusting frequency responses, such as boosting bass via low-pass kernels. The two-dimensional convolution operation central to image processing is defined as (f * g)(x,y) = \sum_{i} \sum_{j} f(i,j) g(x-i, y-j), where f is the input and g is the kernel, producing a feature map that captures localized patterns.

In systems engineering, kernel-based methods support system identification for nonlinear dynamics, mapping input-output data into a higher-dimensional feature space to model complex behaviors without assuming linearity. These approaches, often using positive definite kernels, enable accurate prediction of system responses in applications like control design and fault diagnosis.

The use of kernels in engineering traces to the 1970s boom in digital signal processing, driven by the fast Fourier transform (FFT) algorithm, which accelerated convolution computations via frequency-domain efficiency. Convolutional neural networks (CNNs), employing learnable kernels, emerged in the 1980s through Yann LeCun's work on handwritten digit recognition. As of 2025, kernels remain pivotal in AI hardware accelerators, such as Google's Tensor Processing Units (TPUs), which optimize matrix and convolution operations for efficient inference in large-scale image and signal tasks, achieving high throughput with specialized systolic arrays.
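
The two-dimensional convolution equation above translates directly into code. The following sketch (assuming NumPy; real pipelines would typically call an optimized routine such as scipy.signal.convolve2d, and the toy gradient image is a made-up example) applies the 3x3 sharpening kernel quoted earlier with explicit loops that mirror the defining sum.

    import numpy as np

    def convolve2d(image, kernel):
        # direct implementation of (f * g)(x, y) = sum_i sum_j f(i, j) g(x-i, y-j)
        kh, kw = kernel.shape
        ph, pw = kh // 2, kw // 2
        padded = np.pad(image, ((ph, ph), (pw, pw)), mode="edge")
        flipped = kernel[::-1, ::-1]   # convolution flips the kernel
        out = np.empty_like(image, dtype=float)
        for i in range(image.shape[0]):
            for j in range(image.shape[1]):
                out[i, j] = (padded[i:i + kh, j:j + kw] * flipped).sum()
        return out

    sharpen = np.array([[ 0., -1.,  0.],
                        [-1.,  5., -1.],
                        [ 0., -1.,  0.]])

    image = np.outer(np.linspace(0, 1, 8), np.linspace(0, 1, 8))  # toy gradient
    print(convolve2d(image, sharpen).round(3))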

Organizations

Companies

Kernel Holding S.A. is a major Ukrainian agribusiness company specializing in the production and export of sunflower oil and grain. Founded in 1995 by Andriy Verevskyi as a trading firm in agricultural products, it expanded into processing and farming, with the holding structure established in 2005. The company went public with an initial public offering on the Warsaw Stock Exchange in November 2007, becoming one of the first Ukrainian firms listed there. By fiscal year 2023 (July 2022–June 2023), Kernel processed approximately 2.5 million tons of oilseeds, primarily sunflower seeds, yielding around 1.1 million tons of sunflower oil sold, accounting for about 6% of global production and making it the world's largest sunflower oil producer. Its operations span farming over 363,000 hectares, grain storage with 2.3 million tons capacity, and export terminals, supplying products to more than 70 countries.

The Russian invasion of Ukraine in February 2022 severely disrupted Kernel's operations, blocking Black Sea ports and causing logistical challenges that reduced grain exports by 54% to 3.7 million tons in FY2023 compared to prior years. Attacks resulted in direct losses, including 1,000 tons of grain valued at $260,000 and equipment damage costing $11.2 million, alongside inventory write-downs of $65.7 million due to export delays. Despite these setbacks, Kernel adapted by utilizing alternative routes like the Danube River and rail, benefiting from the Black Sea Grain Initiative to export over 90% of its grain and 50% of its oil via sea, while providing $12.3 million in aid to the Ukrainian Armed Forces and supporting 1,478 enlisted employees.

Kernel, a neurotechnology company based in Los Angeles, focuses on developing non-invasive brain-computer interfaces to record and analyze neural activity for cognitive enhancement, including memory and mental health applications. Founded in 2016 by entrepreneur Bryan Johnson, who invested $54 million initially from proceeds of selling his payment company Braintree to PayPal, the firm aims to make brain data accessible for research and therapeutic uses, such as monitoring brain function in real-time without surgery. In 2020, Kernel raised $53 million in its first external funding round, led by General Catalyst, bringing total funding to over $100 million and enabling development of devices like the Kernel Flow headset for high-resolution brain signal detection. As of 2025, the company continues to advance its brain-computer interface technology, focusing on scalable non-invasive neural recording for research and clinical applications.

Several companies leverage the name "Kernel" to evoke core innovations in agriculture and technology sectors. For instance, Kernel (the restaurant concept), launched in February 2024 by Chipotle founder Steve Ells, initially offered plant-based chicken dishes prepared by robots in a New York City location, emphasizing efficient, sustainable fast-casual dining before closing in 2025 and relaunching as Counter Service, a sandwich shop incorporating animal proteins. These entities highlight a thematic use of "kernel" to signify foundational advancements, from seed-to-oil supply chains to neural data processing, though they face distinct challenges like geopolitical risks in Ukraine and regulatory hurdles in neurotech.

Software projects

The Linux kernel, an open-source Unix-like monolithic kernel, was first released by Linus Torvalds in 1991 as a personal project to create a free operating system for personal computers. By November 2025, the kernel had reached version 6.17 as its latest stable release, with ongoing development toward 6.18, supporting a wide array of hardware architectures and powering embedded devices, servers, and mobile platforms including Android. The kernel's ubiquity in server environments stems from its robust networking and process management capabilities, while its adaptation for Android involves modifications for mobile-specific features like power management. Community governance occurs primarily through the Linux Kernel Mailing List (LKML), where developers submit patches, review code, and coordinate releases under Torvalds' direction.

The XNU kernel, a hybrid design combining microkernel and monolithic elements, forms the core of Apple's Darwin operating system, underlying macOS and iOS. Derived from the Mach microkernel developed at Carnegie Mellon University in the 1980s and integrated with BSD subsystems for Unix compatibility, XNU received significant enhancements from Apple following its 1997 acquisition of NeXT Software, which had pioneered the Mach-BSD hybrid in NeXTSTEP. Apple's contributions since then have focused on optimizing for Apple Silicon processors, adding features like kernel extensions for hardware acceleration, while maintaining open-source availability of the base kernel through the Darwin project.

Google's Fuchsia operating system employs the Zircon kernel, a microkernel initiated in 2016 as a modular, capability-based foundation aimed at embedded and IoT devices, with potential scalability to larger systems. Unlike Linux-based projects, Zircon emphasizes secure inter-process communication and real-time performance, supporting languages like C++, Rust, and Dart for driver and application development. Historically, the THE multiprogramming system, developed in 1968 by Edsger W. Dijkstra and his team at Eindhoven University, represented an early kernel-like structure for batch processing on the Electrologica X8 computer, pioneering concepts like layered design and semaphore-based synchronization across multiple processes.

Kernel development practices emphasize rigorous version control and security maintenance; the Linux kernel adopted the Git version control system in 2005, enabling distributed collaboration and efficient handling of its vast codebase exceeding 30 million lines. Security patches address vulnerabilities promptly, as seen in 2024 fixes for issues like CVE-2024-1086, a flaw in the netfilter subsystem exploited in targeted attacks, integrated into stable releases via backports. The Linux kernel's impact is evident in high-performance computing, powering 100% of the world's top 500 supercomputers as of the November 2025 list, due to its customizability for parallel processing and hardware optimization.
