BASIC
BASIC (Beginner's All-purpose Symbolic Instruction Code) is a family of high-level, interpreted programming languages designed for simplicity and accessibility, enabling beginners and non-technical users to write and execute code with minimal complexity.[1] Developed in 1964 by mathematicians John G. Kemeny and Thomas E. Kurtz at Dartmouth College in Hanover, New Hampshire, it was created to democratize computing for students across disciplines, particularly those outside mathematics and science, through an interactive time-sharing system.[2][3] The language launched on May 1, 1964, at 4:00 a.m. Eastern Time on a GE-225 mainframe computer as part of the Dartmouth Time-Sharing System (DTSS), marking the first widespread use of time-sharing for general-purpose programming.[4] Featuring a straightforward syntax with just 15 core commands—such as PRINT, GOTO, and IF—BASIC emphasized line-by-line execution and immediate feedback, making it ideal for educational settings and rapid prototyping.[1][4]

Its impact extended far beyond academia, profoundly shaping personal computing in the 1970s and 1980s by becoming the default language on early microcomputers like the Altair 8800, Apple I, Commodore 64, and IBM PC.[2][4] A pivotal adaptation was Microsoft BASIC, developed by Bill Gates and Paul Allen in 1975 for the Altair, which fueled the company's growth and embedded BASIC in ROM on millions of home computers, enabling widespread software experimentation and hobbyist innovation.[5] By 1967, over 2,000 Dartmouth students had learned to program in BASIC, and its adoption spread globally, introducing computing to diverse fields like business, education, and engineering.[3][2]

Over decades, BASIC evolved into numerous dialects, including Dartmouth BASIC, Tiny BASIC for resource-constrained systems, QBASIC (bundled with MS-DOS), and Visual Basic for graphical applications, though it faced criticism for promoting unstructured "spaghetti code" via heavy reliance on GOTO statements.[1] Standardization efforts, such as ANSI Minimal BASIC in 1978 and Full BASIC in 1987, aimed to unify variants, but its influence persists in modern languages like Python and Java through shared emphasis on readability and beginner-friendliness.[1] Today, legacy implementations like VBA (Visual Basic for Applications) remain in use for automation in tools such as Microsoft Office, underscoring BASIC's enduring role in making programming approachable.[4]

History
Origins at Dartmouth College
BASIC, or Beginner's All-purpose Symbolic Instruction Code, was developed in 1964 by mathematicians John G. Kemeny and Thomas E. Kurtz at Dartmouth College to make computing accessible to a broad range of students, including those in the arts, sciences, and humanities, rather than limiting it to science and mathematics majors.[2][6] The primary goals were to create a language emphasizing simplicity, interactivity, and an English-like syntax that contrasted with the more complex and formulaic structure of languages like FORTRAN, thereby democratizing programming for non-experts.[7][8]

The first implementation occurred on the Dartmouth Time-Sharing System (DTSS), a custom operating system built for a GE-225 mainframe computer, which allowed multiple users to interact with the machine simultaneously via remote terminals.[9][6] This initial version of Dartmouth BASIC supported floating-point arithmetic and basic operations, reflecting its focus on foundational computing concepts accessible to beginners without advanced numerical precision.[2] Key features included line-numbered statements for organizing code, an immediate execution mode for instant feedback on commands, and a small set of keywords such as LET for assignments, PRINT for output, and IF for conditional logic.[7][2]

On May 1, 1964, at 4 a.m., Kemeny and a student programmer successfully ran the first BASIC programs simultaneously from separate terminals in the basement of Dartmouth's College Hall, marking the language's operational debut and confirming the viability of its time-sharing approach.[9][8] This original Dartmouth BASIC laid the groundwork for later dialects, including Microsoft BASIC, by establishing core principles of accessibility and ease of use.[6]

Early Spread on Time-Sharing Systems
Following the initial development at Dartmouth College, BASIC quickly spread to commercial time-sharing systems in the late 1960s, adapting its foundational principles of simplicity and interactivity to multi-user environments. Digital Equipment Corporation (DEC) played a key role by integrating BASIC-like capabilities into its PDP-8 minicomputers, which were popular for educational and small-scale computing. DEC's FOCAL interpreter, released in 1969, served as an early analog to BASIC on the PDP-8 family, enabling formula-based calculations in time-sharing setups like the TSS-8 operating system, which supported multiple virtual 4K machines for concurrent users.[10] By the early 1970s, DEC's own BASIC-8 dialect extended this further, running under TSS-8 to facilitate remote programming sessions on PDP-8 systems marketed through the EduSystem program, thus broadening access beyond academic labs.[10]

Hewlett-Packard advanced BASIC's time-sharing adoption with the HP 2000A system in 1968, built around the HP 2116 processor and dedicated to a multi-user BASIC interpreter. This implementation supported up to 16 simultaneous terminals, emphasizing conversational programming for education and problem-solving, with automatic floating-point arithmetic to handle results beyond six-digit integers, such as converting large computations to scientific notation (e.g., 1.698990E9).[11][12] The system's disc-based storage allowed for 250,000 words of program libraries, enabling users to save and share code efficiently in real-time sessions.[12]

General Electric (GE) further propelled BASIC's proliferation through its collaboration on the Dartmouth Time-Sharing System (DTSS), operational by 1965 on GE-225 and GE-635 hardware with a Datanet-30 controller. GE's BASIC implementation supported up to 32 remote teletype terminals, compiling and executing programs in 1-4 seconds while handling floating-point operations via the GE-235.[13] This setup influenced widespread adoption in education for undergraduate instruction and faculty research, as well as in business for general-purpose data processing, by providing a user-friendly interface with commands like HELLO for login and SAVE for program storage.[13]

These time-sharing systems democratized programming by offering low-cost remote access via telephone lines, with terminal connections as affordable as $7 per hour for light use on platforms like the HP 2000.[14] Such pricing, combined with no additional line charges for local terminals, allowed students, educators, and small businesses to engage in interactive coding without owning expensive hardware, fostering BASIC's growth from an academic tool to a commercial staple.[12]

However, early time-sharing BASIC dialects faced significant constraints due to limited hardware, particularly on systems like the PDP-8 with only 4K words (approximately 6KB) of core memory per virtual machine.[10] These restrictions resulted in minimalistic implementations, often lacking advanced features like subroutines in initial versions to fit within memory budgets, prioritizing basic arithmetic, loops, and input/output for short numerical tasks.[10]

Adoption on Minicomputers
In the early 1970s, BASIC began transitioning from time-sharing environments to standalone minicomputers, enabling local execution on dedicated hardware for small businesses and emerging hobbyist applications. Digital Equipment Corporation (DEC) played a pivotal role with the release of BASIC-PLUS in 1970 for its PDP-11 series, an extended dialect that introduced robust string handling capabilities and direct disk input/output operations to support text processing and file management tasks.[15] This version enhanced BASIC's utility for commercial and educational computing on the PDP-11, a 16-bit minicomputer launched that year, by allowing efficient manipulation of alphanumeric data without relying on remote systems.[16]

A landmark development occurred in 1975 when Bill Gates and Paul Allen created the 4K BASIC interpreter for the MITS Altair 8800, the first commercially successful personal computer kit based on the Intel 8080 microprocessor. This compact version, fitting within 4 kilobytes of memory, marked the founding of Microsoft and was distributed via cassette tapes or paper tape, making it accessible for hobbyists to load and run programs on affordable hardware kits priced around $397.[17] The Altair BASIC emphasized simplicity for experimentation, with core features like line-numbered statements and basic arithmetic, fostering early software distribution models in the minicomputer era.

Other vendors developed hardware-specific BASIC dialects to meet business needs. Wang Laboratories extended BASIC for its VS series minicomputers, introduced in 1977, incorporating interactive extensions for workstation-based data entry and report generation to streamline office automation tasks such as formatted output and validation.[18] These enhancements, including screen formatting and file integration, positioned Wang VS BASIC as a tool for small business applications like inventory tracking and document processing on systems supporting multiple users.

The adoption of BASIC on minicomputers was driven by falling hardware costs, with systems like Computer Automation's Alpha LSI-2 available for under $2,000 in 1972 and IBM's Series/1 starting at $10,000 by 1976, allowing small businesses to perform on-site programming without the recurring fees of time-sharing services.[19] This affordability, combined with BASIC's ease of use, enabled non-expert users in engineering firms and hobbyist clubs to develop custom applications for data acquisition and process control, expanding computing beyond large institutions.[20]

Technical evolutions in these dialects addressed the demands of larger programs on minicomputers, with variants incorporating structured elements such as the GOSUB statement for subroutine calls and RETURN for modular code organization, as seen in implementations derived from Dartmouth BASIC.[21] These features improved program maintainability on systems like the PDP-11, where memory constraints necessitated efficient control flow without excessive GOTOs.[15]

Explosive Growth in Home Computing
The explosive growth of home computing in the late 1970s was catalyzed by the introduction of affordable, fully assembled microcomputers that featured BASIC as a built-in programming language, making coding accessible to hobbyists, educators, and families without requiring specialized expertise. In 1977, three landmark systems—the Commodore PET, Apple II, and TRS-80—collectively known as the "1977 Trinity," ignited this surge by offering turnkey solutions with BASIC interpreters stored in read-only memory (ROM), allowing users to begin programming immediately upon powering on. These machines drew from earlier minicomputer practices of memory-efficient coding to fit within limited resources, enabling widespread adoption in homes and schools.[22]

The Commodore PET, released in 1977, exemplified this trend with its 8 KB version including a built-in 6502 microprocessor, monochrome display, full keyboard, and cassette drive, all integrated into a single metal-cased unit priced at around $795; its 4 KB or 8 KB ROM-based BASIC enabled instant programming for tasks like simple games and data entry. Similarly, the Apple II, also launched in 1977 and designed by Steve Wozniak, incorporated Integer BASIC in ROM, supporting color graphics primitives and sound generation through its expansion slots and TV output, which facilitated creative applications such as educational software and early multimedia experiments. The TRS-80 Model I, introduced by Radio Shack in the same year for $599.95, came with 4 KB RAM and Level I BASIC in 4 KB ROM, later upgradable to Level II BASIC with extensions for disk storage and basic graphics, appealing to budget-conscious consumers through widespread retail availability.[23][24][25]

Building on this momentum, Atari's 400 and 800 models, released in 1979, further expanded BASIC's reach with dialects that leveraged custom chips for advanced graphics (via ANTIC and GTIA) and sound (via POKEY), alongside four joystick ports for interactive programming of games and simulations; Atari BASIC, typically provided on a cartridge or disk, integrated these hardware features through dedicated commands. By 1985, the home computer market had exploded, with over 10 million units sold cumulatively—many pre-installed with BASIC in ROM—empowering non-programmers to create custom games, utilities, and household applications, and fostering a vibrant ecosystem of user-shared software that democratized computing.[26][27]

Preceding this boom, variants like Tiny BASIC (specified by Dennis Allison in 1975, with Li-Chen Wang's Palo Alto Tiny BASIC published in Dr. Dobb's Journal in 1976) targeted ultra-low-memory machines with just 2-4 KB of RAM, offering a compact interpreter that promoted open-source sharing through magazines such as Byte, where enthusiasts exchanged code listings to adapt it for emerging hardware. This grassroots dissemination laid the groundwork for BASIC's ubiquity, as it encouraged experimentation and community-driven innovation in the constrained environments of early home systems.[28]

Evolution on Personal Computers
The introduction of the IBM Personal Computer in 1981 marked a pivotal shift for BASIC, transitioning it from the diverse 8-bit home computer landscape to the standardized 16-bit IBM PC and compatible ecosystem running MS-DOS. IBM bundled Microsoft's Cassette BASIC in ROM for basic functionality, but for disk-based systems, it included BASICA (Basic Advanced), a disk-resident interpreter that extended the ROM BASIC with additional commands for file handling and disk operations. BASICA was provided as OEM software with PC-DOS, enabling compatibility across IBM's open architecture and fostering BASIC's dominance in early business and hobbyist programming on the platform.[29][30]

Microsoft further solidified its position with GW-BASIC, released in 1983 as a standalone interpreter for non-IBM MS-DOS systems, derived directly from BASICA to ensure binary compatibility. GW-BASIC supported advanced graphics modes, including those for CGA and later EGA adapters, allowing programmers to leverage the PC's expanding hardware capabilities for color and high-resolution output—building on simpler graphics primitives from earlier home computers. This version was bundled with MS-DOS distributions for IBM PC compatibles, promoting portability and ease of use in the growing DOS environment, where programs shifted from cassette tapes to floppy disks and hard drives, accommodating larger codebases up to the system's 640 KB RAM limit.[30][31]

By the late 1980s, competition intensified with Borland's Turbo BASIC in 1987, a compiler that emphasized speed and efficiency, compiling programs faster than Microsoft's offerings while supporting inline assembly for performance-critical sections. This tool appealed to developers seeking optimized executables without sacrificing BASIC's accessibility. Meanwhile, BASIC played a key role in early PC software ecosystems, such as extending VisiCalc—the landmark spreadsheet—with custom macros and functions written in BASIC to add features like advanced mathematical operations.[32][33]

Microsoft's QBasic, introduced in 1991 with MS-DOS 5.0, represented the culmination of this command-line era for PC BASIC interpreters, featuring an integrated full-screen editor for streamlined program development and enhanced file handling capabilities. QBasic maintained backward compatibility with GW-BASIC while incorporating structured programming elements like improved error handling, reflecting the maturing DOS platform and preparing users for more sophisticated dialects. These evolutions underscored BASIC's adaptability to the IBM PC standard, driving its widespread use in education, utilities, and small applications throughout the 1980s.[34]

Post-1990 Developments and Dialects
Following the shift toward graphical user interfaces in the early 1990s, Microsoft introduced Visual Basic 1.0 in May 1991, marking a significant evolution in BASIC by integrating drag-and-drop form designers and event-driven programming specifically for Windows 3.0 applications.[35] This release transformed BASIC from a primarily text-based scripting tool into a rapid application development environment, enabling developers to visually assemble user interfaces and respond to user events like button clicks without extensive manual coding. Over a decade later, Microsoft launched VB.NET in 2002 as an integral part of the .NET Framework, fundamentally redesigning the language with comprehensive object-oriented programming features such as classes, inheritance, and polymorphism.[36] VB.NET incorporated automatic garbage collection for memory management and supported cross-language interoperability within the .NET ecosystem, allowing seamless integration with languages like C# and leveraging shared libraries for enhanced scalability in enterprise software.[37] These advancements positioned VB.NET as a bridge between classic BASIC simplicity and modern, robust application development.[38]

In parallel, open-source efforts emerged to revive and extend legacy BASIC dialects for contemporary hardware. FreeBASIC, first released in 2004, serves as an open-source compiler that maintains syntax compatibility with QBASIC while generating native executables for Windows, Linux, and other platforms, including support for pointers, inline assembly, and object-oriented extensions.[39] Similarly, QB64, developed in the mid-2000s, functions as a self-hosting compiler and emulator that preserves the syntax and commands of the DOS-era QBASIC while compiling to 64-bit binaries for modern operating systems like Windows, macOS, and Linux, facilitating the porting of vintage games and utilities.[40]

Specialized dialects also proliferated for targeted domains. PureBasic, originating in the late 1990s, focuses on cross-platform application development across Windows, Linux, macOS, and Raspberry Pi, offering a clean syntax with built-in support for GUI creation, database access, and networking without requiring external libraries.[41] BlitzBasic, likewise introduced in the 1990s, targeted game development with an emphasis on rapid prototyping, incorporating 3D graphics acceleration, collision detection, and multimedia handling to streamline the creation of real-time applications on platforms like Amiga and later Windows.[42]

As of 2025, BASIC dialects continue to evolve in niche contexts despite a broader decline in mainstream adoption. Gambas, a Linux-oriented IDE and runtime that emulates Visual Basic syntax, saw its version 3.20.0 released in January 2025, featuring improved GUI theming, enhanced web application debugging, and better integration with desktop environments like KDE and GNOME.[43] While BASIC has receded from general-purpose software development in favor of languages like Python and JavaScript, it persists in embedded systems for tasks such as microcontroller programming and simple automation, where its straightforward syntax aids quick prototyping in resource-constrained environments.[44]

Language Features
Syntax Fundamentals
BASIC employs a line-based structure for program organization, where each line consists of a unique integer number followed by a single statement. This design, introduced in the original Dartmouth implementation, ensures that statements are executed in ascending numerical order unless altered by control mechanisms, facilitating simple sequential processing. Line numbers, ranging from 1 to 99999, also act as labels for referencing during editing or flow redirection, with common practice incrementing by 10 (e.g., 10, 20) to accommodate insertions without renumbering.[45]

The fundamental format of a BASIC statement begins with a keyword denoting the operation, such as assignment or output, followed by any necessary operands, expressions, or parameters. Statements are concise and English-like, promoting readability for beginners, and must fit within a single line unless extended in later dialects. Every program requires an END statement as the final line to terminate execution and return control to the system monitor.[21]

BASIC distinguishes between direct mode and program mode to support interactive use. In direct mode, unnumbered statements entered at the prompt are executed immediately upon input, enabling quick computations or diagnostics without storing code. Numbered statements, conversely, accumulate in program mode to form a complete script, which is then run holistically via the RUN command, allowing for persistent, repeatable execution.[46]

Early BASIC variants provided rudimentary error handling, where runtime errors like division by zero would interrupt execution and display an error message, reverting to the monitor without trapping. Extended dialects, starting with implementations like those from Microsoft in the late 1970s, added the ON ERROR GOTO construct to intercept errors and transfer control to a designated line for custom recovery or logging.[47]

From its unstructured beginnings reliant on unconditional jumps, BASIC gradually incorporated optional structured elements in dialects from the 1980s onward. Versions such as True BASIC and ANSI-standard extensions introduced block-delimited constructs, including multi-line conditional statements ending with END IF, to encourage modular design while maintaining compatibility with legacy line-numbered code.[48]
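As a brief sketch of these conventions, the example below uses line numbers in increments of 10, a trailing END, and the later ON ERROR GOTO extension as found in Microsoft-style dialects such as GW-BASIC; the specific line numbers, prompt text, and messages are illustrative only.

10 REM LINES EXECUTE IN ASCENDING NUMERICAL ORDER
20 ON ERROR GOTO 100             ' error trapping, a later extension
30 INPUT "ENTER A NON-NEGATIVE NUMBER"; N
40 PRINT "SQUARE ROOT IS"; SQR(N)
50 END
100 PRINT "ILLEGAL VALUE, TRY AGAIN"
110 RESUME 30

In direct mode, the same PRINT or SQR expressions could be typed without line numbers for immediate execution rather than being stored as part of the program.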
Data Types and Variables

BASIC, designed for accessibility, employs implicit typing where the data type of a variable is inferred from its name rather than requiring explicit declarations, a feature that simplifies programming for beginners. In the original Dartmouth BASIC, variables are numeric by default and consist of a single uppercase letter optionally followed by a digit (e.g., A or X5), supporting up to 286 possible names without suffixes for types. Strings, introduced later in this dialect, are denoted by appending a dollar sign ($) to the variable name (e.g., A$), distinguishing them from numeric variables.[49]

Subsequent dialects, particularly those from Microsoft such as GW-BASIC, expanded this system with type suffixes to specify precision and category, while retaining implicit typing as the default mechanism. Numeric variables without a suffix default to single-precision floating-point (4 bytes, sufficient for most basic calculations), but can be explicitly typed as integers (%) for whole numbers ranging from -32,768 to 32,767 (2 bytes storage), double-precision (#) for higher accuracy (8 bytes), or single-precision (!) explicitly. String variables end with $ and are limited to 255 characters in early implementations, stored with variable-length allocation plus overhead. This suffix convention allows flexible naming—up to 40 characters in Microsoft BASIC—while embedding type information directly, reducing errors for novice users. DEF statements (e.g., DEFINT A-Z) further customize defaults across name ranges, though an explicit suffix takes precedence where both apply.[50]

Arrays in BASIC extend scalar variables into collections, declared using the DIM statement to allocate memory and define dimensions, promoting structured data handling without complex syntax. In Dartmouth BASIC, arrays (called lists or tables) are one- or two-dimensional, with subscripts in parentheses (e.g., A(I) for a list, B(I,J) for a matrix); without DIM, the range defaults to 0 through 10, but larger sizes require explicit dimensioning like DIM A(25). Microsoft dialects maintain this with DIM supporting up to 255 dimensions and 32,767 elements per dimension, using zero-based indexing by default (e.g., DIM C(10) accesses C(0) to C(10)); the OPTION BASE 1 statement shifts to one-based indexing for compatibility with mathematical conventions. Arrays follow the same typing rules as scalars, inheriting suffixes for elements (e.g., DIM D$(5) for a string array).[49][50]

Variable scope in BASIC is global by default, meaning all variables and arrays are accessible throughout the program; dialects like GW-BASIC use COMMON statements to pass variables to a chained program, facilitating simple, linear code structures ideal for education. Later structured variants, such as Visual Basic, introduce local declarations within procedures to limit scope to that subroutine, enhancing modularity while preserving global access via DIM or PUBLIC for broader use. This evolution balances beginner simplicity with advanced control.[50][51]

Type conversion in BASIC relies on built-in functions for explicit casting between types, addressing implicit mismatches and preventing runtime issues like overflow. The CINT function, common in Microsoft dialects, rounds a numeric expression to the nearest integer within -32,768 to 32,767 (e.g., CINT(45.67) yields 46); exceeding these limits triggers an overflow error (code 6). Similar functions include CSNG for single-precision and CDBL for double-precision, with string-to-numeric conversions like CVI handling binary data. These tools ensure safe interoperability, though early dialects like Dartmouth BASIC offered fewer options, relying on basic assignment.[50]
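A short sketch of these conventions in a GW-BASIC-style dialect follows; the variable names and values are illustrative rather than taken from any reference.

10 DEFINT I-K                    ' names I through K default to integer
20 I = 7                         ' stored as a 2-byte integer via DEFINT
30 B# = 1.2345678901234#         ' double-precision suffix
40 N$ = "DARTMOUTH"              ' string suffix
50 OPTION BASE 1                 ' array subscripts start at 1
60 DIM T(5), D$(3)               ' numeric and string arrays
70 T(1) = 45.67
80 PRINT CINT(T(1))              ' explicit conversion; prints 46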
Control Structures and Program Flow

In early implementations of BASIC, such as the original Dartmouth version developed by John G. Kemeny and Thomas E. Kurtz in 1964, program flow relied heavily on unstructured mechanisms like unconditional jumps. The GOTO statement transfers control directly to a specified line number, allowing arbitrary branching within the code.[21] This was essential for creating loops and decisions in the absence of more structured alternatives, though it later drew criticism for complicating program readability. Subroutines were supported via the GOSUB statement, which jumps to a designated line number while saving the return address on a stack, paired with the RETURN statement to resume execution at the calling point.[21] This enabled modular code organization without full procedures, with GOSUB syntax simply GOSUB line_number and RETURN as a standalone command.
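For illustration, a minimal sketch of GOSUB and RETURN in a line-numbered dialect (the line numbers and printed text are arbitrary):

10 PRINT "IN THE MAIN PROGRAM"
20 GOSUB 100
30 PRINT "BACK FROM THE SUBROUTINE"
40 END
100 REM SUBROUTINE STARTS HERE
110 PRINT "INSIDE THE SUBROUTINE"
120 RETURN

The END at line 40 stops execution so control does not fall through into the subroutine body.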
Conditional branching was introduced through the IF statement, which evaluates a relational expression and executes a GOTO if true, typically in a single-line format like IF condition THEN line_number.[21] Later dialects, such as Microsoft BASIC 7.1 (1990), extended this to multi-line blocks with ELSE for handling false conditions, as in IF condition THEN statements ELSE statements.[52] This evolution supported more readable if-then-else logic, where variables could be referenced in conditions to compare numeric or string values.[52]
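As a sketch of the block form described above, assuming a QBasic-style dialect (the prompt and variable N are illustrative; earlier dialects would instead branch to a line number, as in IF N < 0 THEN 200):

' Block IF form available in structured dialects such as QBasic
INPUT "ENTER A NUMBER: ", N
IF N < 0 THEN
    PRINT "NEGATIVE"
ELSEIF N = 0 THEN
    PRINT "ZERO"
ELSE
    PRINT "POSITIVE"
END IF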
Looping constructs began with FOR-NEXT in original BASIC, where FOR variable = start TO end [STEP increment] initializes a counter, and NEXT variable increments it (default step of 1) until exceeding the end value.[21] For instance, a simple factorial calculation might use:
10 LET F = 1
20 FOR I = 1 TO 5
30 LET F = F * I
40 NEXT I
50 PRINT F

This outputs 120, demonstrating iterative multiplication. Structured dialects like Microsoft BASIC introduced WHILE-WEND for condition-checked repetition (WHILE condition ... WEND) and DO-LOOP variants (DO [WHILE|UNTIL condition] ... LOOP or post-tested equivalents), providing indefinite loops without fixed counters.[52]
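As a sketch of the indefinite-loop form in a QBasic-style dialect, the same factorial can be written with a pre-tested DO WHILE loop (names are illustrative):

' Same factorial as the FOR-NEXT example, using a pre-tested loop
F = 1
I = 1
DO WHILE I <= 5
    F = F * I
    I = I + 1
LOOP
PRINT F              ' prints 120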
Multi-way selection was handled by the ON statement, which evaluates an integer expression and branches to one of several line numbers via ON expression GOTO/GOSUB line1, line2, ..., with the index determining the target (out-of-range values are handled differently across dialects, commonly falling through to the next statement).[21] This facilitated switch-like behavior in both early and later BASIC variants.[52]
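A brief sketch of ON ... GOTO dispatch in a line-numbered, GW-BASIC-style dialect (the menu text and line numbers are illustrative):

10 INPUT "CHOOSE 1, 2, OR 3"; K
20 ON K GOTO 100, 200, 300
30 PRINT "OUT OF RANGE": GOTO 10    ' reached when K is 0 or greater than 3
100 PRINT "FIRST CHOICE": END
200 PRINT "SECOND CHOICE": END
300 PRINT "THIRD CHOICE": END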
Input/Output Operations
Input/output operations in BASIC were designed to support its interactive, beginner-friendly nature, emphasizing simple commands for screen display and user prompts in the original Dartmouth implementation. The PRINT statement outputs values or strings to the terminal, with comma-separated items advancing to fixed-width zones (typically 15 characters) and semicolon-separated items continuing on the same line without zone advancement. For instance, in early Dartmouth BASIC, the statement PRINT X1, X2 would display the variables side by side in zoned format, while PRINT "NO UNIQUE SOLUTION" outputs a literal message followed by a newline.[21][53] Later dialects introduced the PRINT USING statement for more precise formatting, allowing templates with symbols like # for digits and . for decimals to control output appearance, such as PRINT USING "###.##"; PI to display pi rounded to two decimal places.[54]
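A short sketch of these output forms in a GW-BASIC-style dialect (PI here is an ordinary variable, not a built-in constant):

10 PI = 3.14159
20 PRINT "A", "B", "C"           ' commas advance to fixed print zones
30 PRINT "VALUE IS "; PI         ' semicolon continues on the same line
40 PRINT USING "###.##"; PI      ' formatted output: prints   3.14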
User input via the INPUT statement prompts the terminal for values, assigning them to specified variables and often displaying a question mark or custom prompt. Syntax like INPUT A pauses execution until the user enters a numeric value for A, with multiple variables allowed (e.g., INPUT X, Y), though early versions required the LET keyword for assignments and could halt on data type mismatches, such as entering text for a numeric variable.[21][53] The READ statement, paired with DATA statements, provided internal data input without user interaction, sequentially assigning constants from DATA lines to variables (e.g., 10 READ A1, A2 followed by 20 DATA 1, 2), advancing a pointer through the data pool until exhaustion, at which point execution typically stopped.[53]
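A minimal sketch of READ with DATA in a line-numbered dialect (the labels and values are illustrative):

10 REM READ PULLS VALUES FROM DATA STATEMENTS IN ORDER
20 FOR I = 1 TO 3
30 READ A$, X
40 PRINT A$; " = "; X
50 NEXT I
60 DATA "ALPHA", 1, "BETA", 2, "GAMMA", 3
70 END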
File handling emerged in extended dialects for minicomputers and personal systems, enabling persistent data storage beyond terminal sessions. The OPEN statement establishes access to a file, specifying mode (INPUT for reading, OUTPUT for writing/overwriting, or APPEND for adding), filename, and file number (an integer identifier, e.g., 1–255). Reading occurs with INPUT # or LINE INPUT # for delimited or whole-line strings, while writing uses PRINT # or WRITE # to output formatted or comma-separated data. Files are closed with CLOSE # to free the file number and flush buffers.[55] For example, a basic read loop might appear as:
OPEN "data.txt" FOR INPUT AS #1
DO WHILE NOT EOF(1)
    LINE INPUT #1, S$
    PRINT S$
LOOP
CLOSE #1

This iterates until the end-of-file condition, detected by the EOF function, which returns true (-1 in some dialects) when no more data is available, preventing "Input Past End" errors.[55][56] String/number mismatches during INPUT could trigger runtime errors, requiring careful variable typing, while EOF checks were essential for robust loops over variable-length files.[56] Some dialects added device-specific I/O for peripherals like printers, using PRINT # with a device file number or commands like LPRINT for direct printer output. Non-blocking input appeared in variants such as QBASIC with INKEY$, which returns a single keypress string immediately (or empty if none pressed), facilitating responsive applications without pausing execution.[57] These features, while varying across implementations, underscored BASIC's adaptability to hardware constraints while maintaining simplicity.
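As a sketch of non-blocking keyboard polling with INKEY$ in a QBasic-style dialect (the loop structure and messages are illustrative):

PRINT "PRESS KEYS; PRESS Q TO QUIT"
DO
    K$ = INKEY$                  ' empty string when no key is waiting
    IF K$ <> "" THEN PRINT "YOU PRESSED "; K$
LOOP UNTIL UCASE$(K$) = "Q"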
Mathematical and Miscellaneous Functions
BASIC included a core set of arithmetic functions designed for straightforward numerical computations, drawing from the original Dartmouth implementation. The ABS function returns the absolute value of its argument, such as ABS(-5) yielding 5, while SGN returns the sign of the argument as 1 for positive, 0 for zero, and -1 for negative values.[53] These functions accepted numeric variables or expressions as inputs, enabling operations like determining magnitudes in simple algorithms.[53] The square root function, SQR, computed the principal square root of a positive argument, as in SQR(25) returning 5. A practical example was its use in the Pythagorean theorem, where the hypotenuse could be calculated as HYP = SQR(A*A + B*B) for legs A and B, illustrating BASIC's emphasis on accessible geometric computations.[53]
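A small sketch of these functions (the leg lengths are arbitrary):

10 LET A = 3
20 LET B = 4
30 LET H = SQR(A * A + B * B)    ' hypotenuse of a 3-4-5 triangle
40 PRINT ABS(-5); SGN(-5); H     ' prints 5 -1 5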
Trigonometric functions in the original Dartmouth BASIC operated exclusively in radians, aligning with standard mathematical conventions for computational efficiency. SIN(X), COS(X), and TAN(X) provided the sine, cosine, and tangent of argument X, respectively; for instance, SIN(π/2) returned 1. Users converting from degrees performed manual adjustments, such as multiplying degrees by π/180 to obtain radians, due to the absence of built-in conversion utilities.[7]
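A brief sketch of the manual degree-to-radian conversion described above (the angle value is illustrative):

10 LET P = 3.14159265            ' approximation of pi
20 LET D = 90                    ' angle in degrees
30 LET R = D * P / 180           ' manual conversion to radians
40 PRINT SIN(R)                  ' prints a value very close to 1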
For random number generation, the RND function produced a pseudo-random value between 0 and 1, with the argument ignored in early implementations but later dialects allowing seeding via the RANDOMIZE statement to initialize the generator, such as RANDOMIZE TIMER for time-based variability.[58] This facilitated simulations and games without requiring advanced seeding knowledge.
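A sketch of seeded random numbers in a Microsoft-style dialect (the dice-roll formula is an assumed example, not drawn from the cited sources):

10 RANDOMIZE TIMER               ' seed the generator from the system clock
20 FOR I = 1 TO 3
30 PRINT INT(RND * 6) + 1        ' simulated die roll between 1 and 6
40 NEXT I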
String manipulation functions emerged prominently in dialects like Microsoft BASIC, supporting text processing for beginners. LEN(S$) returned the number of characters in string S$, as in LEN("HELLO") yielding 5; LEFT$(S$, N) extracted the first N characters, such as LEFT$("BASIC", 3) returning "BAS"; and MID$(S$, I, J) retrieved J characters starting from position I, for example MID$("PROGRAM", 4, 3) producing "GRA".[58]
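A short sketch of these functions on a sample string (the string itself is arbitrary):

10 S$ = "BEGINNERS ALL-PURPOSE"
20 PRINT LEN(S$)                 ' prints 21
30 PRINT LEFT$(S$, 9)            ' prints BEGINNERS
40 PRINT MID$(S$, 11, 3)         ' prints ALL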
Miscellaneous functions provided system-level utilities in later dialects. DATE$ and TIME$ returned the current system date and time as strings, enabling timestamped outputs like PRINT DATE$, while USR(X) invoked user-defined machine code routines with argument X, allowing low-level extensions in resource-constrained environments.[58] These features underscored BASIC's practicality for both educational and applied programming tasks.
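For example, a one-line sketch of a timestamped message in a GW-BASIC-style dialect (the message text is illustrative):

10 PRINT "RUN STARTED ON "; DATE$; " AT "; TIME$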