
Preprocessor

A preprocessor is a language processor that accepts input statements written in one computer language and generates output statements syntactically compatible with another, typically transforming source code before compilation or further processing. This tool programmatically alters its input based on inline annotations, such as directives, to produce modified data for use by compilers, interpreters, or other programs. Preprocessors enable features like macro substitution, conditional compilation, and file inclusion, streamlining code development and maintenance across various domains.

One of the most prominent implementations is the C preprocessor (often abbreviated as cpp), integrated into the GNU Compiler Collection (GCC) and other C/C++ toolchains, which automatically processes source files before compilation. It supports a macro language for defining constants, functions, and code blocks that are expanded inline, along with directives like #include for incorporating header files and #ifdef for conditional sections based on defined symbols. This facilitates portability and variability in large projects, though it can introduce complexity if overused, as seen in software product lines where preprocessor annotations manage multiple variants from a single code base. Beyond C, preprocessors have historical roots in extending assembly languages since the mid-1950s and continue to influence modern tools.

In web development, CSS preprocessors extend stylesheet languages by allowing developers to write in enhanced syntaxes that compile to standard CSS, improving modularity and reusability. Popular examples include Sass (Syntactically Awesome Style Sheets) and Less, which support variables, nesting, mixins, and inheritance to generate efficient CSS output, and are widely adopted in frameworks like Bootstrap. These tools exemplify preprocessors' role in domain-specific languages, where they bridge expressive authoring environments with production-ready formats, enhancing productivity without altering the underlying runtime. Overall, preprocessors remain essential for code reuse, portability, and adapting languages to diverse requirements.

Fundamentals

Definition and Purpose

A preprocessor is a program that modifies or generates source code or data before it is fed into a compiler, interpreter, or another primary processor. Preprocessors vary in approach, with some performing lexical analysis (e.g., tokenization in the C preprocessor) and others simple text substitution (e.g., in general-purpose tools like m4). In programming contexts, a preprocessor serves as an initial transformation layer, enabling developers to abstract repetitive or environment-specific elements from the core logic.

The primary purposes of a preprocessor include macro expansion, file inclusion, conditional compilation, and text substitution, all aimed at simplifying code maintenance and enhancing portability across different systems. These functions allow for the replacement of symbolic names with their definitions, the integration of external code modules, and the selective processing of code based on predefined conditions, thereby reducing redundancy and facilitating adaptation to varying compilation environments. For instance, in languages like C, the preprocessor plays a crucial role in preparing source files for compilation.

In its general form, a preprocessor performs text-based transformations such as substitution, inclusion, and conditional processing on the input, producing modified output for subsequent stages; lexical preprocessors like the C preprocessor additionally tokenize the input into units such as keywords, identifiers, and literals before applying replacement rules. This process operates primarily on the textual structure, preserving the overall syntax while altering content through predefined substitutions and inclusions. Unlike compilers, which perform semantic analysis and code generation, preprocessors operate at a higher level of abstraction, concentrating on syntactic text without interpreting the program's meaning or logic. This distinction ensures that preprocessors handle preparatory transformations efficiently, delegating deeper validation and optimization to the compiler.

Historical Development

The roots of preprocessors lie in the 1950s, emerging from efforts to simplify programming in assembly languages through macro facilities. IBM's Autocoder, introduced in 1956 for the IBM 702 and 705 computers, marked an early milestone as one of the first assemblers to support macros, enabling programmers to define reusable code snippets that expanded during assembly to reduce repetition and improve efficiency in low-level coding.

The 1960s and 1970s brought preprocessors into high-level languages, driven by the demand for more structured code management. IBM's PL/I, first defined in 1964, incorporated a preprocessor supporting macro definitions, conditional compilation, and file inclusion, drawing from prior systems to create a versatile language for scientific and business applications. In 1972, Dennis Ritchie formalized the C preprocessor during the development of the C language at Bell Labs for Unix, initially as an optional tool inspired by file-inclusion features in BCPL and PL/I; it began with basic #include and parameterless #define directives, later enhanced by Mike Lesk and John Reiser with argument support and conditionals around 1973. In 1977, Brian Kernighan and Dennis Ritchie created the m4 macro processor, a general-purpose text processing tool that gained widespread use in the 1980s for generating code and configurations across Unix environments.

The 1980s saw broader adoption and standardization, particularly with C's influence on emerging languages. Bjarne Stroustrup's early C++ implementations from 1979 relied on a preprocessor (Cpre) to add Simula-like classes to C, facilitating the language's evolution into a full object-oriented system by the mid-1980s. A pivotal milestone came in 1989 with the ANSI X3.159 standard for C, which integrated and specified the preprocessor's behavior, including token pasting and improved portability, ensuring consistent implementation across compilers. By the 2000s, preprocessors had extended to various domains, driven by the need for code reusability in large software systems and portability across platforms, allowing abstraction of common patterns to streamline development.

Lexical Preprocessors

C Preprocessor

The C preprocessor is a macro processor that performs initial text manipulation on C source code before compilation, handling tokenization and directive-based operations to facilitate file inclusion, macro substitution, and conditional compilation. It operates as a separate phase in the translation process, transforming the source into a form suitable for the compiler proper, and is integrated into major C and C++ compilers such as GCC and Clang.

Key directives begin with the # symbol and control the preprocessor's behavior. The #include directive inserts the contents of another file, typically a header, into the source file at the point of the directive, supporting both angle-bracket forms for system headers and quoted forms for user headers. The #define directive creates macros, which can be object-like for simple substitutions (e.g., #define PI 3.14159) or function-like with parameters (e.g., #define MAX(a, b) ((a) > (b) ? (a) : (b))). Conditional directives such as #ifdef, #ifndef, #if, #elif, #else, and #endif enable selective inclusion of code based on whether macros are defined or on constant integer expressions. The #pragma directive issues implementation-defined instructions to the compiler, often for optimization or diagnostic control, while #undef removes prior macro definitions.

Macro expansion replaces an identifier matching a defined macro with its replacement list, with the preprocessor rescanning the resulting text for further macro names to handle nesting. For function-like macros, arguments are first fully macro-expanded before substitution into the body, after which the entire result is rescanned; special operators include # for stringification (converting an argument to a string literal) and ## for token pasting (concatenating adjacent tokens). This process occurs in translation phase 4, ensuring that macro invocations are resolved textually without regard to C syntax or semantics until after preprocessing. Predefined macros like __LINE__, __FILE__, and __STDC_VERSION__ provide compilation context and standard compliance indicators.
Common pitfalls in using the C preprocessor include side effects from multiple evaluations of macro arguments, as in #define SQUARE(x) ((x)*(x)), where SQUARE(i++) increments i twice unexpectedly. Macros can also cause namespace pollution by defining global identifiers that conflict with program variables or other libraries, leading to subtle bugs across translation units. Operator precedence issues arise without proper parenthesization in macro bodies, and rescanning rules may yield counterintuitive expansions in complex nested cases.

The C preprocessor is standardized in section 6.10 of the ISO/IEC 9899:2011 (C11) specification, which defines its directives, macro rules, and translation phases, with the earlier C89 and C99 standards providing the foundational model. In C++, the preprocessor largely follows the C standard per ISO/IEC 14882 but includes extensions for compatibility, such as variadic macros, introduced in C99 and adopted in C++11, which allow macros with variable argument counts (e.g., #define DEBUG(fmt, ...) printf(fmt, __VA_ARGS__)).

Other Lexical Preprocessors

Assembly language preprocessors provide macro capabilities for simplifying instruction definitions and reducing repetition in low-level code. The Netwide Assembler (NASM) includes a built-in preprocessor with single-line macros defined via %define for renaming registers or constants, and multi-line %macro directives for complex instruction sequences, alongside support for conditional assembly with %if and file inclusion via %include. Similarly, the GNU Assembler (GAS) employs .macro and .endm directives to define reusable blocks that expand to instructions, enabling shortcuts like parameterized data movement or loop constructs without external tools.

In Fortran, preprocessors like fpp address the needs of scientific computing by enabling conditional compilation and parameter substitution to enhance portability across compilers and architectures. The fpp utility, integrated in tools such as the Intel Fortran Compiler and NAG Fortran Compiler, processes directives prefixed by # (e.g., #if for conditionals and #define for macros) to selectively include blocks or replace tokens with computed values, facilitating adaptations for varying platforms or compilation modes.

Common Lisp incorporates lexical-level macro systems through reader macros, which expand custom notations during the initial reading phase before full evaluation. The reader algorithm dispatches on macro characters to invoke functions that parse and transform input streams into Lisp objects, as defined in the language standard. This approach allows early lexical expansions, like #|...|# for block comments or #(...) for vectors, directly influencing the structure of the code the evaluator sees.

Syntactic Preprocessors

Syntax Customization

Syntax customization preprocessors enable developers to adapt a programming language's surface syntax to better suit domain-specific needs, such as introducing custom operators in functional paradigms or concise shorthands for repetitive constructs, while preserving the core semantics of the language. This customization facilitates tailored notations that improve expressiveness without requiring changes to the language's compiler or runtime behavior.

The primary techniques for syntax customization rely on source-to-source transformations driven by rewrite rules. These transformations map extended syntax to equivalent standard constructs before passing the output to the main compiler. A prominent example is found in the Nemerle programming language, where syntax macros provide a mechanism for defining custom syntactic forms. For instance, developers can create macros that define a C-style for loop by transforming the custom syntax into standard constructs, enhancing readability without altering the executed semantics. Some languages similarly offer compiler-integrated macros that generate code to enrich types with additional operations during compilation.

The typical process begins with parsing the input source—incorporating the custom syntax—into an abstract syntax tree (AST). Custom rewrite rules are then applied to this AST to replace extended forms with semantically equivalent standard syntax, followed by pretty-printing of the transformed AST back into textual source code for input to the primary compiler. This staged approach helps ensure that transformations are hygienic and maintain structural integrity.

One key advantage of syntax customization preprocessors is their ability to boost code readability and productivity in specialized domains, such as scientific computing, without the overhead of forking or extending the base language implementation. This modularity allows teams to adopt intuitive notations locally while remaining interoperable with broader ecosystems.

Language Extension

Language extension preprocessors introduce constructs absent from a language's core specification, such as modules for better organization or concurrency primitives for parallel execution, by transforming source code before compilation. This approach allows developers to augment the language's expressiveness without modifying the compiler itself, fostering modular enhancements like trait derivations in systems languages or custom evaluators in functional paradigms.

Key techniques involve abstract syntax tree (AST) injection or transformation, where the preprocessor parses the input code into an AST, modifies it by inserting or altering nodes to incorporate the new features, and then generates output code that integrates seamlessly with the host language. To prevent name clashes during expansion, hygienic macros are commonly employed; they maintain lexical scoping by tracking identifier origins through time-stamping and alpha-conversion, often using generated symbols (gensyms) to ensure uniqueness without accidental variable capture.

A prominent example is Rust's procedural macros, which operate at compile time to derive trait implementations not natively provided, such as the Serialize trait from the serde crate; applying #[derive(Serialize)] to a struct automatically generates code for serializing its fields into formats like JSON, effectively adding data serialization capabilities to the language. In Lisp dialects, the defmacro facility extends the language by defining new syntactic forms that expand into existing code, allowing users to introduce domain-specific operators or control structures while preserving the language's homoiconic nature.

The typical process begins with the preprocessor parsing the source input to identify extension points, applying predefined transformation rules—often via pattern matching or procedural logic—to inject the new constructs, and finally emitting augmented code that is syntactically and semantically compatible with the target compiler, so the extensions behave as if they were native features. Challenges include maintaining type safety, as generated code must pass the host language's type checker without introducing errors, which requires careful validation during transformation to avoid invalid constructs. Additionally, the Turing-completeness of macro systems can lead to non-terminating expansions or undecidable behaviors, complicating debugging and predictability, though restrictions such as expansion limits help mitigate these risks in practice.

Language Specialization

Language specialization preprocessors adapt general-purpose languages by restricting features or tailoring code to specific domains, such as embedded systems or safety-critical software, to generate optimized and constrained output that meets stringent environmental requirements. These tools enforce subsets of the language, eliminating potentially hazardous elements to enhance reliability in resource-limited or high-stakes applications.

Key techniques include selective inclusion or exclusion of features through conditional directives, parameterization of components to fit hardware constraints, and preprocessing filters that validate and modify input code. For instance, conditional compilation—building on basic mechanisms like #if and #ifdef—allows developers to define macros that activate only domain-appropriate paths, effectively narrowing the scope before compilation. Parameterization might involve substituting hardware-specific values into templates, while filters scan for violations and replace or omit unsafe elements, such as dynamic memory allocation in embedded systems.

In safety-critical software, tools supporting coding guidelines, such as static analyzers integrated with preprocessing, help enforce compliance by identifying and addressing unsafe constructs like unrestricted pointer operations or undefined behaviors, ensuring adherence to standards such as MISRA C:2023. Similarly, in graphics programming, the GLSL preprocessor specializes shaders for GPU pipelines by using directives to exclude non-essential code paths, tailoring vertex or fragment shaders to hardware stages like transformation or rasterization.

The typical process begins with input validation against domain rules, where the preprocessor identifies and processes restricted features—for example, removing guarded unsafe code via #if 0 blocks or replacing it with safe alternatives. Unsafe parts are then stripped or substituted, producing output optimized for the target platform, and only the compliant subset is compiled. These preprocessors improve safety by preemptively eliminating risky features that could lead to runtime failures, while boosting performance through the removal of unused code, resulting in smaller binaries and faster execution suited to constrained environments like embedded devices.

General-Purpose Preprocessors

Core Features

General-purpose preprocessors are versatile macro-based tools, such as m4 and GPP, designed for arbitrary text processing independent of any specific programming language. These tools process input text by expanding user-defined macros, enabling the generation of customized output from templates for diverse applications like configuration files or code generation. Core features include support for argument passing to macros, allowing dynamic substitution of values; recursion in macro definitions to handle iterative processing; conditional expansion for decision-making based on input conditions; file inclusion to embed external content seamlessly; and output diversion to redirect generated text to separate streams for later recombination.

In m4, argument passing uses positional references like $1 for the first argument, while GPP supports up to nine arguments with similar digit-based access and evaluates user macro arguments before expansion. Recursion enables loops through self-referential macros, conditional expansion relies on primitives like m4's ifelse for string comparisons, file inclusion is handled by m4's include or GPP's #include, and output diversion in m4 uses divert to manage multiple output buffers.

Design principles emphasize Turing-complete macro languages, achieved through recursion and conditionals that support complex transformations such as arithmetic computations and string manipulations, while hygiene is maintained via scoped definitions to avoid name conflicts during expansions. For instance, m4's pushdef and popdef stack definitions temporarily, preserving outer definitions and preventing unintended interactions in nested expansions. This scoped approach ensures reliable processing in large-scale templates.

Examples in m4 illustrate these capabilities: define establishes substitutions, as in define(`greet', `Hello, $1!'), which expands greet(world) to "Hello, world!"; ifelse enables pattern matching and branching, as in ifelse(`$1', `yes', `Affirmative', `Negative') for conditional output. Loops are implemented via recursion—for example, a macro to sum numbers can use ifelse to check for empty arguments and recursive calls to accumulate values. GPP offers similar mechanics with customizable syntax for macro invocation and conditionals like #if and #ifeq. These preprocessors enhance portability, particularly in build systems like GNU Autoconf, where m4 generates platform-specific configuration scripts from abstract templates, adapting code to varying host environments without manual adjustments.

Common Applications

General-purpose preprocessors find widespread application in build configuration, where they facilitate the generation of configure scripts and Makefiles tailored to diverse platforms. In the GNU Autotools suite, the m4 macro processor plays a central role by expanding macros in configure.ac scripts to produce portable configure scripts that detect features such as headers, libraries, and functions during cross-platform builds. For instance, macros like AC_CHECK_HEADERS and AC_CHECK_FUNCS enable automated detection of platform-specific capabilities, allowing the substitution of variables in template files (e.g., Makefile.in) to create customized Makefiles that ensure consistent builds across systems. This approach, integral to tools like Autoconf and Automake, supports robust cross-platform builds by handling variations in compiler flags, library paths, and dependencies without manual intervention.

In web development and content generation, general-purpose preprocessors serve as template engines to dynamically preprocess files with variables and logic. Jinja, a Python-based templating system, preprocesses templates by replacing placeholders with data, enabling the creation of responsive web pages through Python-like expressions and control structures. Similarly, Mustache functions as a logic-less engine that preprocesses markup for emails and other outputs by expanding simple tags (e.g., {{variable}}) with provided values, promoting separation of presentation from logic and portability across many host languages. These tools streamline the production of personalized content, such as dynamic email campaigns, by processing templates server-side before rendering.

Preprocessors also excel in code generation, automating the creation of boilerplate for APIs and data serialization. The Protocol Buffers compiler, protoc, acts as a preprocessor by parsing .proto schema files to generate language-specific code (e.g., in C++, Java, or Python) for efficient serialization and deserialization of structured data. This process eliminates repetitive manual coding for message handling, ensuring type-safe API implementations across distributed systems like those in Google's infrastructure.

For documentation purposes, preprocessors enable literate programming paradigms that integrate code and explanatory prose into cohesive documents. Noweb, a language-independent tool, preprocesses source files marked with control sequences to extract and tangle code chunks while weaving them into formatted documentation, such as LaTeX or HTML output. By allowing programmers to structure content for human readability—intertwining narrative with executable code—it supports maintainable projects in any programming language, with minimal syntax overhead.

Beyond programming, general-purpose preprocessors extend to non-coding domains like typesetting. LaTeX macros provide a mechanism for document customization by defining reusable commands that automate formatting and content insertion, such as \newcommand for stylized sections or repeated elements in books and journals. In publishing workflows, these macros facilitate scalable text reuse, enabling authors to tailor layouts, equations, and bibliographies without altering core document structure, thus enhancing efficiency in academic and technical output production.

Modern Uses and Challenges

Integration with Modern Languages

In modern programming languages, the role of preprocessors has evolved from standalone tools to integrated compile-time mechanisms, enabling more robust code generation and transformation within the language itself. This shift addresses limitations of external preprocessors, such as poor error reporting and textual substitution issues, by embedding metaprogramming directly into language semantics. Functional, web-oriented, and systems languages exemplify this adaptation, favoring hygienic macros and reflection over separate preprocessing phases for enhanced safety and expressiveness.

In functional languages, built-in macros in Elixir and Clojure represent an advancement from traditional preprocessors to sophisticated systems that support both compile-time and runtime manipulation. Elixir macros, defined via defmacro/2, operate on the language's quoted expressions to generate and inject code hygienically, avoiding the name clashes common in textual preprocessors like the C preprocessor; for example, the unless macro expands to an if with inverted logic, extending the language with custom control flows. This integration allows for domain-specific languages (DSLs) and dynamic extensions without a distinct preprocessing step. Clojure's macros similarly treat code as data, enabling compile-time expansion for constructs like the when macro, which combines conditional checks with multi-expression bodies, or the threading macro ->, which rearranges argument positions for readable pipelines; this evolves preprocessor-like substitution into runtime-capable metaprogramming, leveraging the language's homoiconicity for seamless syntax extension.

Web and frontend ecosystems rely on preprocessors to augment core languages, compiling enhanced syntax back to standards-compliant output. TypeScript functions as a preprocessor for JavaScript by introducing static types, interfaces, and generics—such as defining interface User { name: string; id: number; }—which compile to untyped JavaScript while providing tooling support and bug prevention through type checking and inference. For CSS, the Sass and Less preprocessors add variables, nesting, and mixins to streamline stylesheet management; Sass compiles features like color functions and modular imports to plain CSS, supporting large-scale design systems with reusable blocks, while Less enables arithmetic operations on values (e.g., @base: 5% * 2) and conditional guards, ensuring compatibility with existing CSS parsers.

Systems languages like Zig incorporate preprocessor-like evaluation directly into compilation without a separate phase, promoting efficiency and simplicity. Zig's comptime keyword permits arbitrary expressions, including loops and conditionals, to execute at compile time for tasks like generic type construction or array initialization—e.g., comptime var y: i32 = 1;, which guarantees compile-time knowledge of the value—allowing metaprogramming through language primitives rather than external tools, with built-in safety checks in modes like ReleaseSafe.

Hybrid approaches in Python and Rust blend preprocessor functionalities with reflective features for syntactic customization. Python metaclasses serve as alternatives to preprocessors by customizing class creation via the metaclass keyword, overriding methods like __new__ to enforce attributes or behaviors at definition time—e.g., injecting methods or validation—thus achieving dynamic syntax-like extensions without textual rewriting. Rust's attribute macros, a type of procedural macro, integrate code transformation into the compilation pipeline; applied to items such as structs, they generate implementations like getter methods from token streams, blending phases for type-safe derivations while avoiding the hygiene issues of traditional macros.

A broader trend in contemporary languages is the move toward these integrated compile-time features, diminishing reliance on external preprocessors for superior error diagnostics, reduced complexity, and better maintainability; for instance, templates in C++ or static if in D replace conditional inclusion, reflecting a preference for native solutions that align with core language design.

Limitations and Alternatives

Preprocessors, while useful for code generation and customization, present several significant limitations that can complicate software development. One primary drawback is the difficulty of debugging expanded code: the preprocessor transforms the source into a form that often obscures the original structure, making it challenging for debuggers to map issues back to the original input, so line numbers and breakpoints may not align with the source as written. Additionally, security risks arise from unsafe macro definitions, which can cause unintended effects such as multiple evaluations of expressions with side effects, potentially enabling vulnerabilities like code injection if macros are misused on untrusted inputs. Performance overhead is another concern, as macro expansions can produce larger intermediate codebases that increase compilation times, and developers often avoid runtime checks in macros to prevent execution slowdowns.

Hygiene issues further exacerbate these problems in non-hygienic systems like the C preprocessor, where textual substitution without scoping leads to name capture; this occurs when a macro inadvertently binds to or shadows identifiers in the surrounding context, causing subtle bugs that are hard to diagnose. For example, a macro defining a temporary variable might clash with an existing name in the code it expands into, altering program behavior unexpectedly. Moreover, the text-based nature of preprocessing bypasses type checking, allowing type mismatches or invalid constructs to propagate until later compilation stages, which amplifies error proneness and reduces code reliability.

To address these limitations, alternatives such as constexpr functions in C++ offer a more robust approach by performing computations at compile time within the language's type system, avoiding textual substitution and providing better diagnostics and debuggability than macros. Build tools enable code generation through scripted templates, separating concerns and allowing more maintainable preprocessing without embedding logic directly in the source language. Transpilers, such as Babel for JavaScript, provide higher-level transformations by converting between similar languages while preserving semantics and enabling features like syntax extensions with improved error reporting. These methods generally offer superior abstraction, reducing the risks associated with raw textual manipulation.

Preprocessors are best avoided in languages with native metaprogramming facilities, where built-in macro systems provide hygienic expansion and full language integration at compile time, eliminating the need for an additional preprocessing layer that could introduce inconsistencies. Looking ahead, AI-assisted tools may supplant some manual preprocessing by automating the creation of boilerplate and customized code from prompts or surrounding context, enhancing development processes while mitigating traditional preprocessor pitfalls.

  18. [18]
    [PDF] An update to the preprocessor specification (rev. 2)
    The C preprocessor specification inherited by C++ uses undefined behavior to specify latitude for implementation differences. This technically allows one ...
  19. [19]
    Macro (Using as) - Sourceware
    The commands .macro and .endm allow you to define macros that generate assembly output. For example, this definition specifies a macro sum that puts a sequence ...
  20. [20]
    fpp Preprocessing - Intel
    Intel Fortran Compiler Introduction, Compiler Setup, Compiler Reference, Language Reference, Compilation Program Structure, Optimization and Programming
  21. [21]
    fpp: Fortran preprocessor
    fpp is the preprocessor used by the NAG Fortran compiler. It optionally accepts two filenames as arguments: input-file and output-file are, respectively, the ...
  22. [22]
  23. [23]
    Variables - Sass
    Sass variables are simple: you assign a value to a name that begins with $, and then you can refer to that name instead of the value itself.Missing: nesting | Show results with:nesting
  24. [24]
    Style Rules - Sass
    Nested rules are clever about handling selector lists (that is, comma-separated selectors). Each complex selector (the ones between the commas) is nested ...Property Declarations · Parent Selector · Placeholder Selectors
  25. [25]
    PPI - Parse, Analyze and Manipulate Perl (without perl) - metacpan.org
    The purpose of PPI is not to parse Perl Code, but to parse Perl Documents. By treating the problem this way, we are able to parse a single file containing Perl ...
  26. [26]
    Meta-programming in Nemerle | Request PDF - ResearchGate
    Nemerle [23] provides a powerful syntax extension system based on syntactic macros. For example, we can define a language construct foldfor as follows: ... ...
  27. [27]
    Program Improvement by Source-to-Source Transformation
    A compilation model based on the use of source-to-source program transformations is used to provide a framework for discussing issues of code generation.
  28. [28]
    Scala 3 Macros
    With a macro, we can treat programs as values, which allows us to analyze and generate them at compile time. A Scala expression with type T is represented by an ...
  29. [29]
    [2008.00840] GPP, the Generic Preprocessor - arXiv
    Aug 3, 2020 · In this paper, we present GPP, an extensible, general-purpose preprocessor whose principal advantage is that its syntax and behaviour can be ...
  30. [30]
    Procedural Macros - The Rust Reference
    Procedural macros allow you to run code at compile time that operates over Rust syntax, both consuming and producing Rust syntax.
  31. [31]
    [PDF] Hygienic Macro Expansion (or (ezph (szp)2)
    Eugene Kohlbecker is an IBM Graduate Fellow. This material is based on work supported by the National Science Foundation un- der grants DCR 85-01277 and MCS ...
  32. [32]
    Macros - The Rust Programming Language
    Procedural macros accept some code as an input, operate on that code, and produce some code as an output rather than matching against patterns and replacing the ...
  33. [33]
  34. [34]
    [PDF] Marco: Safe, Expressive Macros for Any Language*
    Some macro systems check whether expanded fragments will pass type checking in the target language. Multi-stage extensions generate safe code in, for ...<|control11|><|separator|>
  35. [35]
    [PDF] How Close is Existing C/C++ Code to a Safe Subset?
    Dec 28, 2023 · Abstract: Using a safe subset of C++ is a promising direction for increasing the safety of the program- ming language while maintaining its ...
  36. [36]
    NASA's 10 Coding Rules Explained: How to Build Reliable and Safe ...
    Oct 27, 2025 · NASA's 10 rules (the “Power of 10”) provide a clear and effective coding standard for critical C software. By avoiding complex constructs ...
  37. [37]
    None
    Below is a merged summary of the GLSL preprocessor and its role in GPU pipeline specialization, consolidating all information from the provided segments into a single, dense response. To maximize detail and clarity, I’ve organized key information into tables where appropriate, while retaining narrative explanations for context. All unique details, including preprocessing directives, purposes, and URLs, are included.
  38. [38]
    [PDF] MISRA C and C++ in OSS: Yes, We Can! - Sched
    Sep 26, 2025 · MISRA view: excluded code should be removed by preprocessing. Linux view: nothing changes if excluded code is removed by later compiler.
  39. [39]
    GNU M4 1.4.20 macro processor - GNU.org
    Kernighan and Ritchie then joined forces to develop the original m4 , described in “The M4 Macro Processor”, Bell Laboratories (1977), https://wolfram. ...
  40. [40]
    GPP - Generic Preprocessor
    gpp is a general-purpose preprocessor with customizable syntax, suitable for a wide range of preprocessing tasks. Its independence on any programming language ...
  41. [41]
    Notes on the M4 Macro Language - Michael Breen
    M4 is a template, macro, or preprocessor language. It processes text using embedded directives, acting as a preprocessor or macro processor.
  42. [42]
  43. [43]
    Jinja — Jinja Documentation (3.1.x)
    - **Description**: Jinja is a template engine used in web development for preprocessing HTML with variables, enabling dynamic content generation using Python-like syntax.
  44. [44]
    {{ mustache }}
    Logic-less templates. Available in Ruby, JavaScript, Python, Erlang, Elixir ... GitHub pages: https://github.com/mustache/mustache.github.com.
  45. [45]
    Protocol Buffers Documentation
    Protocol Buffers are language-neutral, platform-neutral extensible mechanisms for serializing structured data.Go Generated Code GuideC++ Generated Code Guide
  46. [46]
    Noweb home page
    ### Summary of Noweb as a Literate Programming Preprocessor
  47. [47]
    LaTeX Documentation
    ### Summary of LaTeX Macros for Document Customization
  48. [48]
    Are preprocessors obsolete in modern languages? - Stack Overflow
    May 30, 2010 · C's preprocessing can do really neat things, but if you look at the things it's used for you realize it's often for just adding another ...Missing: history | Show results with:history
  49. [49]
    Kernel — Elixir v1.19.2
    Summary of each segment:
  50. [50]
    Clojure - Macros
    ### Summary of Clojure Macros
  51. [51]
    Documentation - TypeScript for JavaScript Programmers
    ### Summary: How TypeScript Acts as a Preprocessor for JavaScript, Augmenting It with Types
  52. [52]
  53. [53]
    Less CSS
    This is the official documentation for Less, the language and Less. js, the JavaScript tool that converts your Less styles to CSS styles.Features · Using Less.js · Less Preview (online... · Tools
  54. [54]
    Documentation - The Zig Programming Language
    Below is a merged summary of the Zig `comptime` documentation, combining all the information from the provided segments into a single, dense response. To maximize detail and clarity, I’ve organized key information into tables where appropriate (in CSV format for density) and retained narrative sections for conceptual overviews and applications. All URLs are consolidated at the end.
  55. [55]
  56. [56]
    The C Preprocessor vs D - D Programming Language
    Macros are unknown to the debugger. Trying to debug a program with symbolic data is undermined by the debugger only knowing about macro expansions, not the ...<|separator|>
  57. [57]
    [PDF] The Love/Hate Relationship with the C Preprocessor: An Interview ...
    One developer clarifies that run-time checks would cause performance overheads when checking for debugging mode, for example. This developer explains that ...
  58. [58]
    Introduction: Why Lisp? - gigamonkeys
    One report that shows Lisp coming out well compared to C++ and Java in the combination of programmer and program efficiency is discussed.
  59. [59]
    (PDF) AI-ASSISTED CODE GENERATION AND OPTIMIZATION
    Aug 8, 2025 · The aim of this paper is to explore the AI-Driven Code Generation and Optimization. The continuous evolution of code generators has also opened up new ...<|control11|><|separator|>