"Symbolic regression"의 두 판 사이의 차이
		
		
		
		
		
		둘러보기로 가기
		검색하러 가기
		
				
		
		
	
Pythagoras0 (토론 | 기여)  | 
				Pythagoras0 (토론 | 기여)   (→메타데이터:  새 문단)  | 
				||
| (같은 사용자의 중간 판 하나는 보이지 않습니다) | |||
| 47번째 줄: | 47번째 줄: | ||
Notes
Wikidata
- ID : Q831366
 
Corpus
- On the other hand, this work presents the first application of RANSAC to symbolic regression with GP, with impressive results.[1]
 - Discovering unmodeled components in astrodynamics with symbolic regression.[2]
 - The paper explores the use of symbolic regression to discover missing parts of the dynamics of space objects from tracking data.[2]
 - The paper presents a simple, yet representative, example of incomplete orbital dynamics to test the use of symbolic regression.[2]
 - The process of generating a computer program to fit numerical data is called symbolic regression.[3]
 - The authors showcase the potential of symbolic regression as an analytic method for use in materials research.[4]
 - Next, the authors discuss industrial applications of symbolic regression and its potential applications in materials science.[4]
 - In this prospective paper, we focus on an alternative to machine-learning models: symbolic regression.[5]
 - I think symbolic regression is a great tool to be aware of.[5]
 - By not requiring a specific model to be specified, symbolic regression isn't affected by human bias, or unknown gaps in domain knowledge.[6]
 - Symbolic regression is one of the best known problems in GP (see Reference).[7]
 - As any evolutionary program, symbolic regression needs (at least) two object types : an individual containing the genotype and a fitness.[7]
 - "Predicting friction system performance with symbolic regression and genetic programming with factor variables".[8]
 - “Prediction of Stress-Strain Curves for Aluminium Alloys using Symbolic Regression”.[8]
 - Symbolic regression is a data-based modelling method where the goal is to find a formula that describes given data.[8]
 - However, in symbolic regression one does not merely fit parameters to a fixed model structure.[8]
 - One of the more interesting ones is symbolic regression, which is used in Eureqa models within DataRobot.[9]
 - Perhaps the most common technique used in symbolic regression is genetic programming (GP).[9]
 - Let’s use the “Auto MPG” data from UCI: http://archive.ics.uci.edu/ml/datasets/Auto+MPG to understand symbolic regression.[9]
 - One of the key decisions in symbolic regression is how to represent the programs we are creating.[9]
 - This was an example of symbolic regression: discovering a symbolic expression that accurately matches a given dataset.[10]
 - However, as we will see below, this does not prevent us from discovering and exploiting these properties to facilitate symbolic regression.[10]
 - Here is my code to compute Symbolic Regression and plot the function.[11]
 - Symbolic Regression also tries to fit observed experimental data.[12]
 - There are different ways to represent the solutions in Symbolic Regression.[12]
 - In Symbolic Regression, many initially random symbolic equations compete to model experimental data in the most promising way.[12]
 - Symbolic Regression is used to solve this task.[12]
 - It is known that symbolic regression is a widely used method for mathematical function approximation.[13]
 - The flowchart of symbolic regression based on genetic programming (see more details of this flowchart and SR in Supplementary Information).[14]
 - Symbolic regression (SR) is a powerful method for building predictive models from data without assuming any model structure.[15]
 - In the majority of work exploring symbolic regression, features are used directly without acknowledgement of their relative scale or unit.[16]
 - This paper extends recent work on the importance of standardisation of features when conducting symbolic regression.[16]
 - Other symbolic regression libraries Due to its popularity, symbolic regression is implemented by most genetic programming libraries.[17]
 - Right: GP-based symbolic regression finds different candidate control laws.[18]
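Several of the items above touch on how GP-based symbolic regression is set up in practice (the gplearn package in [5], program representation in [9]). The following is only a rough sketch of a typical gplearn run, not code from the cited sources; the synthetic data and every hyperparameter value are assumptions made for illustration.

    # Minimal gplearn sketch (assumes `pip install gplearn`); all values are illustrative only.
    import numpy as np
    from gplearn.genetic import SymbolicRegressor

    rng = np.random.RandomState(0)
    X = rng.uniform(-1, 1, (200, 2))            # two input features
    y = X[:, 0] ** 2 - X[:, 1] + 0.5            # hidden formula the search should rediscover

    est = SymbolicRegressor(population_size=1000,
                            generations=20,
                            function_set=('add', 'sub', 'mul', 'div'),
                            parsimony_coefficient=0.01,   # penalizes needlessly large formulas
                            random_state=0)
    est.fit(X, y)
    print(est._program)                         # best evolved expression, e.g. add(mul(X0, X0), ...)

The parsimony coefficient is gplearn's handle on the accuracy-versus-size trade-off that several items above describe.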
 
Sources
1. RANSAC-GP: Dealing with Outliers in Symbolic Regression with Genetic Programming
2. Discovering unmodeled components in astrodynamics with symbolic regression
3. Symbolic Regression Genetic Programming Example
4. Symbolic regression in materials science
5. gplearn Symbolic Regression
6. Symbolic regression
7. Symbolic Regression Problem: Introduction to GP — DEAP 1.3.1 documentation
8. Josef Ressel Centre for Symbolic Regression
9. Symbolic Regression from Scratch with Python
10. AI Feynman: A physics-inspired method for symbolic regression
11. How to get the function result from Symbolic Regression with R
12. Learn More about Your Data: A Symbolic Regression Knowledge Representation Framework
13. Symbolic Regression Problems by Genetic Programming with Multi-branches
14. Simple descriptor derived from symbolic regression accelerating the discovery of new perovskite catalysts
15. Benchmarking state-of-the-art symbolic regression algorithms
16. Feature standardisation and coefficient optimisation for effective symbolic regression
17. Glyph: Symbolic Regression Tools
18. Glyph: Symbolic Regression Tools (PDF)
 
Metadata
Wikidata
- ID : Q831366
 
Spacy pattern list
- [{'LOWER': 'decision'}, {'LEMMA': 'tree'}]
 
Notes
Corpus
- A decision boundary found with symbolic regression.[1]
 - Many free symbolic regression packages have been developed in the past, including notably gplearn but also many other small repositories that can be found on GitHub.[1]
 - No particular model is provided as a starting point for symbolic regression.[2]
 - This means that it will possibly take a symbolic regression algorithm longer to find an appropriate model and parametrization, than traditional regression techniques.[2]
 - All symbolic regression problems use an arbitrary data distribution, and try to fit the data with the most accurate symbolic formula available.[3]
 - As any evolutionary program, symbolic regression needs (at least) two object types : an individual containing the genotype and a fitness.[3]
 - In a symbolic regression optimization, it is important to discard a large formula if a smaller one with the same accuracy is encountered.[4]
 - These results open up new opportunities to explain symbolic regression models compared to the approximations provided by model-agnostic approaches.[5]
 - On the other hand, one needs to sample the equation search space vastly, especially for high-dimensional problems on which traditional symbolic regression fails more easily.[6]
 - Modeling data with symbolic regression has some advantages over modeling data with regular regressions, neural networks, or other mathematical tools.[6]
 - In these examples, given a set of distances traveled at increasing times, or a set of radiation intensities vs time, symbolic regressions would be expected to retrieve the respective equations.[6]
 - The core idea of the work is relatively simple: to build their new symbolic regression algorithm they combine neural network fitting with a set of physics-inspired constraints and equation features.[6]
 - Symbolic regression is a very interpretable machine learning algorithm for low-dimensional problems: these tools search equation space to find algebraic relations that approximate a dataset.[7]
 - Here, one essentially uses symbolic regression to convert a neural net to an analytic equation.[7]
 - The task of discovering the underlying equation from a set of input-output pairs is called symbolic regression.[8]
 - Traditionally, symbolic regression methods use hand-designed strategies that do not improve with experience.[8]
 - In this paper, we introduce the first symbolic regression method that leverages large scale pre-training.[8]
 - Prof. Dr. Diveev is a renowned specialist in the field of control and a leading researcher in Russia in evolutionary computation and symbolic regression.[9]
 - In this paper, we introduce the first symbolic regression method that leverages large scale pre-training.[10]
 - Symbolic regression is a branch of regression analysis that tries to emulate such a process.[10]
 - Even assuming that the vocabulary of primitives e.g. {sin, exp, +, ...} is sufficient to express the correct equation behind the observed data, symbolic regression is a hard problem to tackle.[10]
 - In Section 3, we present our algorithm for neural symbolic regression that scales.[10]
 - Symbolic regression is the process of constructing mathematical expressions that best fit given data sets, where a target variable is expressed in terms of input variables.[11]
 - The flexible representation of GP along with its "white box" nature makes it a dominant method for symbolic regression.[11]
 - Data incompleteness is a pervasive problem in symbolic regression, and machine learning in general, especially when dealing with real-world data sets.[11]
 - Little attention has been paid to symbolic regression on incomplete data.[11]
 - In this approach, learning algorithms are used to generate new insights which can be added to domain knowledge bases supporting again symbolic regression.[12]
 - This problem, known as symbolic regression, is relevant when one seeks to generate new physical knowledge and insights.[13]
 - Since practitioners are primarily interested in knowledge generation, the ability to interact with a symbolic regression algorithm would be highly valuable.[13]
 - Thus, we present an interactive symbolic regression framework that allows users not only to configure runs, but also to control the system during training.[13]
 - The team focused on a type of discrete optimization called symbolic regression — finding short mathematical expressions that fit data gathered from an experiment.[14]
 - Symbolic regression is typically approached in machine learning and artificial intelligence with evolutionary algorithms, Petersen said.[14]
 - Authors said the algorithm is widely applicable, not just to symbolic regression, but to any kind of discrete optimization problem.[14]
 - Symbolic regression addresses this issue by searching the space of all possible free form equations that can be constructed from elementary algebraic functions.[15]
 - Our experiments included fifteen other different machine learning approaches including five genetic programming methods for symbolic regression and ten machine learning methods.[15]
 - This chapter explores the use of symbolic regression to perform unsupervised learning by searching for implicit relationships of the form \(f(\vec{x}, y) = 0\).[16]
 - The flowchart of symbolic regression based on genetic programming (see more details of this flowchart and SR in Supplementary Information).[17]
 - We propose a framework that leverages deep learning for symbolic regression via a simple idea: use a large model to search the space of small models.[18]
 - Symbolic regression has been one of the first applications of genetic programming and as such is tightly connected to evolutionary algorithms.[19]
 - In recent years several non-evolutionary techniques for solving symbolic regression have emerged.[19]
 - Symbolic regression (SR) is an approach to machine learning (ML) in which both the parameters and structure of an analytical model are optimized.[20]
 - In this study we introduce a new technique for symbolic regression that guarantees global optimality.[21]
 - There has been much research into improving symbolic regression techniques.[21]
 - A symbolic regression scheme consists of a space of valid mathematical expressions together with a mechanism for its exploration.[21]
 - We begin by describing symbolic regression and our implementation of this technique using genetic programming.[22]
 - In contrast to standard regression analysis, symbolic regression involves the breeding of simple computer programs or functions that are a good fit to a given set of data.[22]
 - We apply our symbolic regression algorithm to experimental data from the repeated ultimatum game.[22]
 - Koza has termed the problem of finding a function, in symbolic form, that fits a finite sample of data as symbolic regression.[22]
 - Another example: the Rydberg formula for the wavelengths of the spectral lines of the hydrogen atom, \(1/\lambda_{\text{vac}} = R_{\mathrm{H}}\left(1/n_1^2 - 1/n_2^2\right)\), an empirical formula that was originally guessed by Rydberg.[23]
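Item [3] above notes that, like any evolutionary program, DEAP's symbolic regression example needs two object types: an individual holding the genotype (a GP expression tree) and a fitness. Below is a minimal sketch of how those types are typically declared; the primitive set and size limits are assumptions for illustration, not the documented example verbatim.

    # Declaring the fitness and individual types for GP-based symbolic regression with DEAP.
    import operator
    from deap import base, creator, gp, tools

    pset = gp.PrimitiveSet("MAIN", 1)            # one input variable
    pset.addPrimitive(operator.add, 2)
    pset.addPrimitive(operator.sub, 2)
    pset.addPrimitive(operator.mul, 2)
    pset.renameArguments(ARG0="x")

    creator.create("FitnessMin", base.Fitness, weights=(-1.0,))               # minimize error
    creator.create("Individual", gp.PrimitiveTree, fitness=creator.FitnessMin)

    toolbox = base.Toolbox()
    toolbox.register("expr", gp.genHalfAndHalf, pset=pset, min_=1, max_=2)
    toolbox.register("individual", tools.initIterate, creator.Individual, toolbox.expr)

    ind = toolbox.individual()                   # a random genotype carrying an (unset) fitness
    func = gp.compile(ind, pset)                 # callable version of the expression tree
    print(str(ind), "->", func(1.5))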
 
Sources
1. Symbolic regression software
2. Symbolic regression
3. Symbolic Regression Problem: Introduction to GP — DEAP 1.3.3 documentation
4. Symbolic Regression: The Forgotten Machine Learning Method
5. Measuring feature importance of symbolic regression models using partial effects
6. Real-world applications of symbolic regression
7. MilesCranmer/PySR: High-Performance Symbolic Regression in Python
8. Neural Symbolic Regression that scales
9. Machine Learning Control by Symbolic Regression
10. Neural symbolic regression that scales
11. Genetic Programming for Symbolic Regression on Incomplete Data
12. Learn More about Your Data: A Symbolic Regression Knowledge Representation Framework
13. An Interactive Visualization Platform for Deep Symbolic Regression
14. Novel deep learning framework for symbolic regression
15. What are Memetic Algorithms?
16. Symbolic Regression of Implicit Equations (PDF)
17. Simple descriptor derived from symbolic regression accelerating the discovery of new perovskite catalysts
18. Deep symbolic regression: Recovering mathematical expressions from data via risk-seeking policy gradients
19. GECCO 2022 Workshop on Symbolic Regression
20. Contemporary Symbolic Regression Methods and their Relative Performance
21. Globally optimal symbolic regression
22. Using symbolic regression to infer strategies
23. Statistical methods in particle physics
 
Metadata
Wikidata
- ID : Q18171762
 
Spacy pattern list
- [{'LOWER': 'symbolic'}, {'LEMMA': 'regression'}]
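For context, here is a hedged sketch of how a token pattern like the one above is typically applied with spaCy's rule-based Matcher; the pipeline name en_core_web_sm and the example sentence are assumptions, not part of the wiki entry.

    # Using the pattern with spaCy's Matcher (assumes `python -m spacy download en_core_web_sm`).
    import spacy
    from spacy.matcher import Matcher

    nlp = spacy.load("en_core_web_sm")
    matcher = Matcher(nlp.vocab)
    matcher.add("SYMBOLIC_REGRESSION", [[{'LOWER': 'symbolic'}, {'LEMMA': 'regression'}]])

    doc = nlp("Symbolic regressions would be expected to retrieve the respective equations.")
    for match_id, start, end in matcher(doc):
        print(doc[start:end].text)               # the LEMMA condition also matches "regressions"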