Enhancement of adapted lesk algorithm applied on word sense disambiguation / Magat, Angelito P. and Deodato, Karl Deo L.

By: Magat, Angelito P. and Deodato, Karl Deo L.
Series: March 2011. Description: 47 pp.; 28 cm. Content type: text. Media type: unmediated. Carrier type: volume.
ABSTRACT: People today use the internet and computers for almost everything they do, whether for personal or professional purposes. In the fast-paced lives people lead, we tend to seek automation for most of our traditional activities. One small fraction of those activities is disambiguating words. The research presented here concerns the Enhancement of the Adapted Lesk Algorithm Applied on Word Sense Disambiguation. It uses the Adapted Lesk Algorithm as the backbone of the word disambiguation process. The algorithm was carefully analysed and broken down to ensure a clear improvement over its previous version. The existing algorithm is capable of breaking down the senses of words to disambiguate any given group of words or sentence; however, it has flaws and room for improvement. One is its limited capability to disambiguate a word in a holistic manner: the existing algorithm does not take an approach that efficiently covers all the words in a given input. Another is its inability to exploit the high probability that words in a given context share the same sense. The last deficiency we identified in the algorithm is the lack of relative-sense identification, which would greatly enhance its capability to disambiguate words. As the study progressed, the researchers determined how these deficiencies of the existing algorithm could be corrected. For the first problem, we made the algorithm use a holistic approach to disambiguation, namely Global Disambiguation. For the second, the researchers introduced monosemy assurance among the words to be disambiguated. Finally, the algorithm was given a filter that removes unnecessary senses. With all this, the algorithm became more efficient, accurate, and reliable through the analysis and logical reasoning the proponents undertook.
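The thesis builds on the Lesk family of algorithms, which pick a word's sense by measuring gloss overlap with the surrounding context. The following is a minimal sketch of that core overlap idea, not the thesis's enhanced algorithm; the tiny gloss dictionary and the function name `lesk_disambiguate` are invented for illustration, whereas a real system would draw glosses from a lexical database such as WordNet.

```python
def lesk_disambiguate(word, context, glosses):
    """Pick the sense of `word` whose gloss shares the most words with `context`."""
    context_words = set(context.lower().split())
    best_sense, best_overlap = None, -1
    for sense, gloss in glosses[word].items():
        # Count how many words the sense's gloss shares with the context.
        overlap = len(context_words & set(gloss.lower().split()))
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

# Hypothetical two-sense gloss dictionary, for illustration only.
GLOSSES = {
    "bank": {
        "finance": "an institution that accepts deposits and lends money",
        "river": "the sloping land beside a body of water",
    }
}

sense = lesk_disambiguate("bank", "she sat on the land beside the water", GLOSSES)
# The "river" gloss shares the words {the, land, beside, water} with the context,
# while the "finance" gloss shares none, so the river sense wins.
```

The enhancements the abstract describes (Global Disambiguation, monosemy assurance, sense filtering) would layer on top of this overlap scoring, e.g. by scoring all words in the sentence jointly rather than one at a time.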
Item type: Book. Current location: PLM. Home library: PLM. Collection: Filipiniana Section. Call number: Filipiniana-Thesis T QA76.63.M34.2011. Status: Available. Barcode: FT6155. Total holds: 0.

Undergraduate Thesis: (BSCS major in Computer Science) - Pamantasan ng Lungsod ng Maynila, 2011.


