Symbolic verification of web crawler functionality and its properties

Keerthi S. Shetty, Swaraj Bhat, Sanjay Singh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

Nowadays, people frequently use search engines to retrieve documents from the Web. Web crawling is the process by which a search engine gathers pages from the Web in order to index them and support search. Web crawlers are the heart of search engines: they continuously crawl the Web to find new pages that have been added and pages that have been removed. Due to the growing and dynamic nature of the Web, it has become a challenge to traverse all the URLs in web documents and to handle them. The entire crawling process may be viewed as traversing a web graph. The aim of this paper is to model check the crawling process and the crawler's properties using a symbolic model checker tool called NuSMV. The basic operation of a hypertext crawler and its properties have been modeled in terms of CTL specifications, and it is observed that the system takes care of all the constraints by satisfying all the specifications.
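The abstract notes that the entire crawling process may be viewed as traversing a web graph. A minimal sketch of that view, as a breadth-first traversal over an in-memory adjacency map (the function and variable names here are illustrative assumptions, not taken from the paper, and a real crawler would fetch and parse live pages rather than read a dictionary):

```python
from collections import deque

def crawl(web_graph, seed):
    """Breadth-first traversal of a web graph: visit each reachable URL once.

    `web_graph` maps each URL to the list of URLs it links to.
    Both names are hypothetical, chosen only for this sketch.
    """
    frontier = deque([seed])   # URLs discovered but not yet fetched
    visited = set()            # URLs already crawled
    order = []                 # crawl order, for inspection
    while frontier:
        url = frontier.popleft()
        if url in visited:
            continue
        visited.add(url)
        order.append(url)
        # "Fetching and parsing" a page here is just a dictionary lookup.
        for link in web_graph.get(url, []):
            if link not in visited:
                frontier.append(link)
    return order
```

In this framing, a CTL property such as "every discovered URL is eventually fetched" corresponds to checking that every URL appended to the frontier eventually appears in the crawl order.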

Original language: English
Title of host publication: 2012 International Conference on Computer Communication and Informatics, ICCCI 2012
DOIs
Publication status: Published - 27-03-2012
Event: 2012 International Conference on Computer Communication and Informatics, ICCCI 2012 - Coimbatore, India
Duration: 10-01-2012 to 12-01-2012

Conference

Conference: 2012 International Conference on Computer Communication and Informatics, ICCCI 2012
Country: India
City: Coimbatore
Period: 10-01-12 to 12-01-12

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems

