References
- [1] Probabilistic Context-Free Grammars (PCFGs). Lecture notes, Columbia University Computer Science. The key idea is to extend the definition of a context-free grammar to give a probability distribution over possible derivations.
- [2] Kim, Y., Dyer, C., & Rush, A. M. Compound Probabilistic Context-Free Grammars for Grammar Induction. arXiv:1906.10225 [cs.CL], v9, 29 Mar 2020. Defines a PCFG as a grammar G together with rule probabilities π = {π_r} for r ∈ R, where π_r is the probability of rule r.
- [3] Probabilistic Context-Free Grammars. Chapter 11 of Manning & Schütze, Foundations of Statistical Natural Language Processing. Moves beyond the "linear tyranny" of n-gram models and HMM tagging models toward more complex notions of grammar.
- [4] Probabilistic Context-Free Grammar (PCFG) Summary.
- [5] Booth, T. L., & Thompson, R. A. (1973). Applying Probability Measures to Abstract Languages. IEEE Transactions on Computers. Considers two methods for assigning a probability to each word of a language.
- [6] {Probabilistic|Stochastic} Context-Free Grammars (PCFGs). Lecture notes. Defines the Inside and Outside probabilities used in PCFG computations, including the base case that the inside probability of a terminal, β_j(k, k) = P(w_k | N^j_kk, G), equals the rule probability P(N^j → w_k).
- [7] Hopcroft, J. E., Motwani, R., & Ullman, J. D. Introduction to Automata Theory, Languages, and Computation (2nd ed.).
- [8] Probabilistic Context-Free Grammars (PCFGs): Chomsky Normal Form. Lecture notes, TTIC. Includes the result that any PCFG can be put in Chomsky normal form in such a way that it defines the same probability for each string.
- [9] Probabilistic Context-Free Grammars and Beyond. Lecture slides. Notes that probabilistic grammars can assign non-zero probability to every string, relying on the probability distribution to distinguish likely from unlikely strings.
- [10] Smith, N. A., & Johnson, M. (2007). Weighted and Probabilistic Context-Free Grammars Are Equally Expressive. Computational Linguistics. Notes that necessary conditions and sufficient conditions for a PCFG to be tight are given in several places, including Booth and Thompson (1973) and Wetherell (1980).
- [11] Smith, N. A., & Johnson, M. (2007). Weighted and Probabilistic Context-Free Grammars Are Equally Expressive. Computational Linguistics. Studies the relationship between weighted context-free grammars (WCFGs), in which each production is associated with a positive real-valued weight, and PCFGs.
- [12] Formal Grammars and Markov Models. DTIC report. Notes close links between probabilistic finite-state automata (and their relation to stochastic formal grammars) and hidden Markov models.
- [13] Probabilistic Grammars and their Applications (Applied Mathematics). Discusses probabilistic context-free grammars, which turn out to be essentially the same thing as branching processes.
- [14] Goodman, J. (1999). Semiring Parsing. Computational Linguistics (ACL Anthology). Synthesizes work on parsing algorithms, deductive parsing, and the theory of algebra applied to formal languages into a general system for describing parsers.
- [15] The Design Principles and Algorithms of a Weighted Grammar Library. Covers grammars that may be probabilistic context-free grammars or, more generally, weighted context-free grammars; in all cases the weights play a crucial role.
- [16] Estimation of Consistent Probabilistic Context-Free Grammars. Concerns the consistency property: the grammar should assign probability one to the set of strings it generates.
- [17] Marcus, M. P., Santorini, B., & Marcinkiewicz, M. A. (1993). Building a Large Annotated Corpus of English: The Penn Treebank. Computational Linguistics, 19(2).
- [18] The Return of Lexical Dependencies: Neural Lexicalized PCFGs. 1 Nov 2020. Observes that lexicalized grammar rules are powerful but that the available counts are sparse, historically requiring extensive smoothing.
- [19] Lari, K., & Young, S. J. (1990). The Estimation of Stochastic Context-Free Grammars Using the Inside-Outside Algorithm. Computer Speech and Language. The Inside-Outside algorithm assumes that the source can be modelled as a context-free hidden Markov process (Baker, 1979).
- [20] Baker, J. K. (1979). Trainable Grammars for Speech Recognition. Journal of the Acoustical Society of America. The algorithm permits automatic training of the stochastic analog of an arbitrary context-free grammar.
- [21] Lari, K., & Young, S. J. The Estimation of Stochastic Context-Free Grammars Using the Inside-Outside Algorithm. Describes a novel pre-training algorithm that can give significant computational savings.
- [22] Carroll, G., & Charniak, E. (1992). Two Experiments on Learning Probabilistic Dependency Grammars from Corpora. Department of Computer Science, Brown University, Technical Report CS-92-16, March 1992.
- [23] Kasami, T. (1965). An Efficient Recognition and Syntax-Analysis Algorithm for Context-Free Languages.
- [24] Baker, J. K. (1979). Trainable Grammars for Speech Recognition. Journal of the Acoustical Society of America. Presents an algorithm that permits automatic training of the stochastic analog of an arbitrary context-free grammar.
- [25] Effective Context-Free Parsing Models. Lecture notes. Notes that impossible rules would simply end up with very low probability estimates and could be "pruned" from the grammar after a couple of iterations.
- [26] Unsupervised Spectral Learning of WCFG as Low-Rank Matrix Completion. Derives a spectral method for unsupervised learning of weighted context-free grammars, framing WCFG induction as finding a Hankel matrix of low rank.
- [27] Bayesian Inference of Grammars. Lecture slides. Topics include learning probabilistic context-free grammars, morphological segmentation, learning from types with Chinese restaurant processes, and adaptor grammars.
- [28] Data-driven, PCFG-based and Pseudo-PCFG-based Models for …. The key idea: by converting a dependency structure to a constituency one, the PCFG-LA approach can be reused to learn pseudo-grammars.
- [29] Evaluating Two Methods for Treebank Grammar Compaction. Explores ways a treebank grammar can be reduced in size, or "compacted," using two kinds of technique.
- [30] Probabilistic Context-Free Grammar Induction Based on Structural Zeros. Presents a method for inducing concise and accurate probabilistic context-free grammars for efficient use in the early stages of multi-stage parsing.
- [31] Charniak, E. (2000). A Maximum-Entropy-Inspired Parser. ACL Anthology. The parser uses a maximum-entropy-inspired model, achieving 90.1% average precision/recall on sentences of at most 40 words and 89.5% on sentences of at most 100 words.
- [32] Eddy, S. R., & Durbin, R. (1994). RNA Sequence Analysis Using Covariance Models. Nucleic Acids Research.
- [33] Sakakibara, Y., et al. (1994). Stochastic Context-Free Grammars for tRNA Modeling. Nucleic Acids Research, 25 Nov 1994. Applies stochastic context-free grammars (SCFGs) to the problems of folding, aligning, and modeling families of tRNA sequences.
- [34] Treebank-Based Probabilistic Phrase Structure Parsing. Notes that a hand-crafted system, while it might in general return far fewer analyses, has no way of indicating which ones are more likely.
- [35] Collins, M. Head-Driven Statistical Models for Natural Language Parsing. The appendices of Collins (1999) give a precise description of the parsing algorithms and an analysis of their computational complexity.
- [36] Klein, D., & Manning, C. D. (2003). Accurate Unlexicalized Parsing. ACL Anthology. Shows that unlexicalized PCFG parsing with simple state splits achieves 86.36% accuracy while being simpler, more compact, and easier to optimize than lexicalized models.
- [37] ALP: Data Augmentation Using Lexicalized PCFGs for Few-Shot Text Classification. Uses lexicalized PCFG (L-PCFG) parse trees to consider both constituents and dependencies, capturing two very different views of syntax in text data.
- [38] Approximating CKY with Transformers. ACL Anthology, 6 Dec 2023. Investigates the ability of transformer models to approximate the CKY algorithm, using them to directly predict a sentence's parse.
- [39] Stochastic Context-Free Grammars for tRNA Modeling. Nucleic Acids Research. Applies stochastic context-free grammars (SCFGs) to the problems of folding, aligning, and modeling families of tRNA sequences.
- [40] Knudsen, B., & Hein, J. Pfold: RNA Secondary Structure Prediction Using Stochastic Context-Free Grammars. Nucleic Acids Research. The KH-99 algorithm uses an SCFG to produce a prior probability distribution over RNA structures given an alignment.
- [41] Stochastic Modeling of RNA Pseudoknotted Structures. Introduces a grammar-modeling approach for RNA pseudoknotted structures based on parallel communicating grammar systems (PCGS).
- [42] Dowell, R. D., & Eddy, S. R. (2004). Evaluation of Several Lightweight Stochastic Context-Free Grammars for RNA Secondary Structure Prediction. BMC Bioinformatics, 4 Jun 2004. Implements nine different small SCFGs to explore the tradeoffs between model complexity and prediction accuracy.
- [43] COVE (software), Eddy Lab. Sean Eddy's implementation of stochastic context-free grammar methods for covariance models of RNA secondary structure (old version).
- [44] Lowe, T. M., & Eddy, S. R. tRNAscan-SE: A Program for Improved Detection of Transfer RNA Genes in Genomic Sequence. Identifies 99–100% of transfer RNA genes in DNA sequence while giving less than one false positive per 15 gigabases.
- [45] Nawrocki, E. P., Kolbe, D. L., & Eddy, S. R. (2009). Infernal 1.0: Inference of RNA Alignments. Bioinformatics. Infernal builds consensus RNA secondary structure profiles called covariance models (CMs) and uses them to search nucleic acid sequence databases.
- [46] rMSA: A Sequence Search and Alignment Algorithm to Improve RNA …. Reports that rMSA improves covariance-based RNA secondary structure prediction accuracy by ≥20% and deep-learning-based contact prediction accuracy by ≥5%.
- [47] Predicting Protein Secondary Structure Using Stochastic Tree Grammars. Proposes a method for predicting the protein secondary structure of a given amino acid sequence, based on a training algorithm for the probability parameters of the grammar.
- [48] Probabilistic Grammatical Model for Helix-Helix Contact Site Classification. 18 Dec 2013. Presents a probabilistic grammatical framework for problem-specific protein languages and applies it to the classification of helix-helix contact sites.
- [49] Estimating Probabilistic Context-Free Grammars for Proteins Using Contact Map Constraints. 18 Mar 2019. Notes that models built only from a multiple sequence alignment of the proteins are unable to capture non-local dependencies between protein residues.
- [50] Estimating Probabilistic Context-Free Grammars for Proteins Using Contact Map Constraints. Defines a PCFG as a quintuple G = ⟨Σ, V, v0, R, θ⟩, where θ is a finite set of rule probabilities.
- [51] Harnessing Deep Learning for Proteome-Scale Detection of Amyloid …. 15 Jul 2025. Uses an ensemble of six PCFGs in the Chomsky Form with Contacts, which is suited to efficiently representing pairwise dependencies.