Chilkat Examples

(VB.NET) A Simple Web Crawler

This example demonstrates a very simple web crawler built with the Chilkat Spider component.

Chilkat .NET Downloads

Chilkat .NET Framework

Chilkat for .NET Core

Dim spider As New Chilkat.Spider

Dim seenDomains As New Chilkat.StringArray
Dim seedUrls As New Chilkat.StringArray

seenDomains.Unique = True
seedUrls.Unique = True

' You will need to change the start URL to something else...
seedUrls.Append("http://something.whateverYouWant.com/")

' Set outbound URL exclude patterns
' URLs matching any of these patterns will not be added to the 
' collection of outbound links.
spider.AddAvoidOutboundLinkPattern("*?id=*")
spider.AddAvoidOutboundLinkPattern("*.mypages.*")
spider.AddAvoidOutboundLinkPattern("*.personal.*")
spider.AddAvoidOutboundLinkPattern("*.comcast.*")
spider.AddAvoidOutboundLinkPattern("*.aol.*")
spider.AddAvoidOutboundLinkPattern("*~*")

' Use a cache so we don't have to re-fetch URLs previously fetched.
spider.CacheDir = "c:/spiderCache/"
spider.FetchFromCache = True
spider.UpdateCache = True

While seedUrls.Count > 0

    Dim url As String = seedUrls.Pop()
    spider.Initialize(url)

    ' Crawl up to 5 URLs within this domain,
    ' but first save the base domain in seenDomains.
    Dim domain As String = spider.GetUrlDomain(url)
    seenDomains.Append(spider.GetBaseDomain(domain))

    Dim i As Integer
    Dim success As Boolean
    For i = 0 To 4
        success = spider.CrawlNext()
        If (success = True) Then

            ' Display the URL we just crawled.
            Debug.WriteLine(spider.LastUrl)

            ' If the last URL was retrieved from cache,
            ' we won't wait.  Otherwise we'll wait 1 second
            ' before fetching the next URL.
            If (spider.LastFromCache <> True) Then
                spider.SleepMs(1000)
            End If

        Else
            ' No more URLs to crawl in this domain; exit the loop.
            Exit For
        End If

    Next

    ' Add the outbound links to seedUrls, except
    ' for the domains we've already seen.
    For i = 0 To spider.NumOutboundLinks - 1

        url = spider.GetOutboundLink(i)
        domain = spider.GetUrlDomain(url)
        Dim baseDomain As String = spider.GetBaseDomain(domain)
        If (seenDomains.Contains(baseDomain) = False) Then
            ' Don't let our list of seedUrls grow too large.
            If (seedUrls.Count < 1000) Then
                seedUrls.Append(url)
            End If

        End If

    Next

End While
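After the inner loop finishes for a given seed, the Spider object still holds the lists of URLs it crawled and the ones that failed. As a rough sketch (assuming the documented NumSpidered/GetSpideredUrl and NumFailed/GetFailedUrl members of Chilkat.Spider), you could report them like this:

```vbnet
' Sketch: report results for the seed just processed.
' Assumes Chilkat.Spider's NumSpidered/GetSpideredUrl and
' NumFailed/GetFailedUrl members; adapt to your Chilkat version.
Dim j As Integer
For j = 0 To spider.NumSpidered - 1
    Debug.WriteLine("crawled: " & spider.GetSpideredUrl(j))
Next
For j = 0 To spider.NumFailed - 1
    Debug.WriteLine("failed: " & spider.GetFailedUrl(j))
Next
```

Placing this just before the `End While` lets you see, per seed domain, which URLs were fetched successfully and which could not be retrieved.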

 

© 2000-2024 Chilkat Software, Inc. All Rights Reserved.