In this paper, we present a fully Lagrangian method based on the radial basis function (RBF) finite difference (FD) method for solving convection–diffusion partial differential equations (PDEs) on evolving surfaces.

It has been clarified that words in written texts are classified into two groups, called Type-I and Type-II words. Type-I words exhibit long-range dynamic correlations in written texts, while Type-II words do not show any type of dynamic correlation. Although the stochastic process yielding Type-II words has been clarified to be a superposition of Poisson point processes with various intensities, there is no definitive model for Type-I words. In this study, we introduce the Hawkes process, a kind of self-exciting point process, as a candidate for the stochastic process that governs the occurrence of Type-I words; that is, the purpose of this study is to establish that the Hawkes process is useful for modeling occurrence patterns of Type-I words in real written texts. The relation between the Hawkes process and an existing model for Type-I words, in which hierarchical structures of written texts are considered to play a central role in yielding dynamic correlations, is also discussed.

Estimating sentence-like units and sentence boundaries in human language is an important task in the context of natural language understanding. While this topic has been considered using a range of techniques, including rule-based approaches and supervised and unsupervised algorithms, a common aspect of these methods is that they inherently rely on a priori knowledge of human language in one form or another. Recently we have been exploring synthetic languages based on the concept of modeling behaviors using emergent languages. These synthetic languages are characterized by a small alphabet and limited vocabulary and grammatical structure. A particular challenge for synthetic languages is that there is generally no a priori language model available, which limits the use of many natural language processing methods. In this paper, we explore how it may be possible to discover natural 'chunks' in synthetic language sequences in terms of sentence-like units. The problem is how to do this with no linguistic or semantic language model. Our approach is to consider the problem from the perspective of information theory. We extend the basis of information geometry and propose a new concept, which we term information topology, to model the incremental flow of information in natural sequences. We introduce an information topology view of the incremental information and incremental tangent angle of the Wasserstein-1 distance of the probabilistic symbolic language input. The approach is not suggested as a fully viable alternative for sentence boundary detection per se, but provides a new conceptual method for estimating the structure and natural limits of information flow in language sequences without any semantic knowledge. We consider relevant existing performance metrics, such as the F-measure, indicate their limitations, and introduce a new information-theoretic global performance measure based on modeled distributions. Although the methodology is not proposed for human language sentence detection, we provide some examples using human language corpora where potentially useful results are shown. The proposed model shows potential advantages for overcoming difficulties due to the disambiguation of complex language and potential improvements for human language methods.
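The abstract does not specify how the incremental Wasserstein-1 distance is computed over a symbol stream, so the following is only a minimal illustrative sketch. It assumes an ordered alphabet (so the one-dimensional CDF formulation of W1 applies) and compares the empirical symbol distribution before and after each new symbol; the sequence "aababbbabz", the alphabet, and all function names are hypothetical, not taken from the paper.

```python
from collections import Counter

def w1_distance(p, q, alphabet):
    """Wasserstein-1 distance between two discrete distributions over an
    ordered alphabet: the L1 distance between their CDFs."""
    cdf_p = cdf_q = total = 0.0
    for s in alphabet:
        cdf_p += p.get(s, 0.0)
        cdf_q += q.get(s, 0.0)
        total += abs(cdf_p - cdf_q)
    return total

def empirical(seq):
    """Empirical symbol distribution of a sequence."""
    counts = Counter(seq)
    return {s: c / len(seq) for s, c in counts.items()}

def incremental_w1(sequence, alphabet):
    """W1 distance between the empirical distributions before and after
    each new symbol arrives; spikes mark candidate information shifts."""
    return [
        w1_distance(empirical(sequence[: t - 1]),
                    empirical(sequence[:t]),
                    alphabet)
        for t in range(2, len(sequence) + 1)
    ]

inc = incremental_w1("aababbbabz", "abz")
```

Under this toy formulation, repeating an already-dominant symbol barely moves the distribution, while a novel symbol (here the final "z") produces a measurable increment; the paper's actual construction, including the tangent-angle view, would build on such increments rather than on this exact definition.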
Entropy is an international and interdisciplinary peer-reviewed open access journal of entropy and information studies, published monthly online by MDPI.
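As background to the Hawkes abstract above: a univariate Hawkes process with an exponential kernel has conditional intensity lambda(t) = mu + sum over past events t_i of alpha * exp(-beta * (t - t_i)), and Ogata's thinning algorithm is a standard way to simulate it. The sketch below is illustrative only; the parameter values are assumptions, not estimates from the paper's corpora.

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, t_max, seed=0):
    """Simulate a univariate Hawkes process with intensity
    lambda(t) = mu + sum_i alpha * exp(-beta * (t - t_i))
    via Ogata's thinning algorithm. Stationarity needs alpha / beta < 1."""
    rng = random.Random(seed)
    events = []
    t = 0.0
    while True:
        # The intensity decays between events, so its current value is an
        # upper bound until the next event occurs.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)
        if t >= t_max:
            break
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() * lam_bar <= lam_t:
            events.append(t)  # accepted: each event raises future intensity
    return events

events = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, t_max=50.0, seed=7)
```

The self-excitation term is what produces the clustered, bursty event patterns that make the Hawkes process a plausible candidate for the long-range dynamic correlations of Type-I words.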