
International Journal of Scientific & Technology Research
Volume 1 - Issue 4, May 2012 Edition

Website: http://www.ijstr.org

ISSN 2277-8616



A NOVEL AND EFFICIENT KNN USING MODIFIED APRIORI ALGORITHM


 

AUTHOR(S)

Ritika Agarwal, Dr. Barjesh Kochar, Deepesh Srivastava

 

KEYWORDS

Classification, association rules, K-nearest neighbor, Apriori algorithm

 

ABSTRACT

In the field of data mining, classification and association rule mining are two important techniques for discovering new patterns. The K-nearest neighbor (KNN) classifier and the Apriori algorithm are among the most widely used methods for classification and association rule mining, respectively. Individually, however, each faces challenges such as high running time and inefficiency on very large databases. This paper uses the two methods hand in hand: we modify the Apriori algorithm and use it to select the attributes on which K-nearest neighbor classifies the data. The modified Apriori identifies the few attributes that mainly determine the class; these attributes are called prominent attributes in this paper. Restricting KNN to the prominent attributes improves its efficiency considerably.
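To make the pipeline concrete, the sketch below shows one plausible way to combine the two steps on categorical data: an Apriori-style support and confidence pass over (attribute = value, class) pairs picks out a small set of prominent attributes, and a plain K-nearest-neighbor vote is then restricted to those attributes. The toy records, attribute names, thresholds, and attribute-scoring heuristic are all illustrative assumptions for this sketch; they are not taken from the paper's modified Apriori.

from collections import Counter

# Toy categorical records; the last field is the class label (illustrative data only).
DATA = [
    {"outlook": "sunny",    "wind": "weak",   "humidity": "high",   "class": "no"},
    {"outlook": "sunny",    "wind": "strong", "humidity": "high",   "class": "no"},
    {"outlook": "overcast", "wind": "weak",   "humidity": "high",   "class": "yes"},
    {"outlook": "rain",     "wind": "weak",   "humidity": "normal", "class": "yes"},
    {"outlook": "rain",     "wind": "strong", "humidity": "normal", "class": "no"},
    {"outlook": "overcast", "wind": "strong", "humidity": "normal", "class": "yes"},
]

def prominent_attributes(records, min_support=0.3, min_confidence=0.7):
    # Apriori-style first pass: count the support of every (attribute = value) item
    # and of every (attribute = value, class) pair, then keep an attribute if at
    # least one of its values implies a class with enough support and confidence.
    # The thresholds here are assumptions for the sketch, not the paper's values.
    n = len(records)
    value_counts = Counter()   # (attr, value)          -> count
    pair_counts = Counter()    # ((attr, value), class) -> count
    for rec in records:
        label = rec["class"]
        for attr, value in rec.items():
            if attr == "class":
                continue
            value_counts[(attr, value)] += 1
            pair_counts[((attr, value), label)] += 1
    prominent = set()
    for ((attr, value), label), count in pair_counts.items():
        support = count / n
        confidence = count / value_counts[(attr, value)]
        if support >= min_support and confidence >= min_confidence:
            prominent.add(attr)
    return prominent

def knn_predict(records, query, attrs, k=3):
    # Plain KNN on categorical data: Hamming distance restricted to the
    # prominent attributes, followed by a majority vote among the k neighbours.
    def distance(rec):
        return sum(rec[a] != query[a] for a in attrs)
    neighbours = sorted(records, key=distance)[:k]
    votes = Counter(rec["class"] for rec in neighbours)
    return votes.most_common(1)[0][0]

if __name__ == "__main__":
    attrs = prominent_attributes(DATA)
    query = {"outlook": "overcast", "wind": "weak", "humidity": "high"}
    print("prominent attributes:", sorted(attrs))
    print("predicted class:", knn_predict(DATA, query, attrs))

The per-query cost of KNN grows with the number of attributes compared per record, so restricting the distance computation to the prominent attributes is where this sketch saves work relative to running KNN over all attributes.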

 
