Using Windows Netscape via Linux to access the Internet

 
ek (Guest)
Posted: 27 May 2001 10:31    Subject: Using Windows Netscape via Linux to access the Internet


What settings do I have to make in Netscape under MS Windows so that I can reach the Internet via SuSE 7.1?

The network connection is up (the Linux machine already runs as a print server, for example), and from the Linux machine I can reach the Internet without any problems, e.g. to send this e-mail.

Many thanks for your help

ek
 

Ansgar Jazdzewski (Guest)





Posted: 27 May 2001 17:06    Subject: Re: Using Windows Netscape via Linux to access the Internet

I do this with wwwoffle (a proxy server).

It still has to be configured properly (you do that in /etc/wwwoffle/wwwoffle.conf); you really only need to change a few settings there.

Then set up a proxy server in Netscape: the IP address of the Linux machine and the port the proxy listens on (normally 8080).
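For reference, the same settings can be made in Netscape's GUI (Edit -> Preferences -> Advanced -> Proxies, "Manual proxy configuration") or directly in the prefs.js file in the user's profile directory. The sketch below is only an illustration: the address 192.168.0.1 is a placeholder for the Linux machine's LAN IP, and the port matches the http-port from the config further down.

```
// prefs.js fragment -- placeholder values, adjust to your network.
// network.proxy.type = 1 selects manual proxy configuration.
user_pref("network.proxy.type", 1);
user_pref("network.proxy.http", "192.168.0.1");
user_pref("network.proxy.http_port", 8080);
```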

Here is my config (it is the one shipped with SuSE Linux):

#
# WWWOFFLE - World Wide Web Offline Explorer - Version 2.6.
#
# WWWOFFLE Configuration file /etc/wwwoffle/wwwoffle.conf
#
# Example configuration file Copyright 1997,98,99,2000 Andrew M. Bishop.
# They may be distributed under the GNU Public License, version 2, or
# any higher version. See section COPYING of the GNU Public license
# for conditions under which this file may be redistributed.
#


#
# The configuration file (wwwoffle.conf) specifies all of the parameters that
# control the operation of the proxy server. The file is split into sections
# each containing a series of parameters as described below. The file
# CHANGES.CONF explains the changes in the configuration file between this
# version of the program and previous ones.
#
# The file is split into sections, each of which can be empty or contain one or
# more lines of configuration information. The sections are named and the order
# that they appear in the file is not important.
#
# The general format of each of the sections is the same. The name of the
# section is on a line by itself to mark the start. The contents of the section
# are enclosed between a pair of lines containing only the '{' and '}'
# characters or the '[' and ']' characters. When the '{' and '}' characters are
# used the lines between contain configuration information. When the '[' and
# ']' characters are used then there must be only a single non-empty line between
# them that contains the name of a file (in the same directory) containing the
# configuration information for the section.
#
# Comments are marked by a '#' character at the start of the line and they are
# ignored. Blank lines are also allowed and ignored.
#
# The phrases URL-SPECIFICATION (or URL-SPEC for short) and WILDCARD have
# specific meanings in the configuration file and are described at the end. Any
# item enclosed in '(' and ')' in the descriptions means that it is a parameter
# supplied by the user, anything enclosed in '[' and ']' is optional, the '|'
# symbol is used to denote alternate choices. Some options apply to specific
# URLs only, this is indicated by having a URL-SPECIFICATION enclosed between
# '<' & '>' in the option, the first URL-SPECIFICATION to match is used. If no
# URL-SPECIFICATION is given then it matches all URLs.
#


#
# StartUp
# -------
#
# This contains the parameters that are used when the program starts, changes to
# these are ignored if the configuration file is re-read while the program is
# running.
#
# http-port = (port)
# An integer specifying the port number for the HTTP proxy to use
# (default=8080).
#
# wwwoffle-port = (port)
# An integer specifying the port number for the WWWOFFLE control
# connections to use (default=8081).
#
# spool-dir = (dir)
# The full pathname of the top level cache directory (spool directory)
# (default=/var/spool/wwwoffle).
#
# run-uid = (user) | (uid)
# The username or numeric uid to change to when the WWWOFFLE server is
# started (default=none). This option is not applicable to win32
# and only works if the server is started by root on UNIX.
#
# run-gid = (group) | (gid)
# The groupname or numeric gid to change to when the WWWOFFLE server is
# started (default=none). This option is not applicable to win32
# and only works if the server is started by root on UNIX.
#
# use-syslog = yes | no
# Whether to use the syslog facility for messages or not (default=yes).
#
# password = (word)
# The password used for authentication of the control pages, for
# deleting cached pages etc (default=none). For the password to be
# secure the configuration file must be set so that only authorised
# users can read it.
#
# max-servers = (integer)
# The maximum number of server processes that are started for online and
# automatic fetching (default=8).
#
# max-fetch-servers = (integer)
# The maximum number of server processes that are started to fetch pages
# that were marked in offline mode (default=4). This value must be less
# than max-servers or you will not be able to use WWWOFFLE interactively
# online while fetching.
#

StartUp
{
http-port = 8080
wwwoffle-port = 8081

spool-dir = /var/spool/wwwoffle

run-uid = wwwrun
run-gid = nogroup

use-syslog = yes

password = none

max-servers = 8
max-fetch-servers = 4
}


#
# Options
# -------
#
# Options that control how the program works.
#
# log-level = debug | info | important | warning | fatal
# The minimum log level for messages in syslog or stderr
# (default=important).
#
# socket-timeout = (time)
# The time in seconds that WWWOFFLE will wait for data on a socket
# connection before giving up (default=120).
#
# dns-timeout = (time)
# The time in seconds that WWWOFFLE will wait for a DNS (Domain Name
# Service) lookup before giving up (default=60).
#
# connect-timeout = (time)
# The time in seconds that WWWOFFLE will wait for the socket connection
# to be made before giving up (default=30).
#
# connect-retry = yes | no
# If a connection cannot be made to a remote server then WWWOFFLE should
# try again after a short delay (default=no).
#
# ssl-allow-port = (integer)
# A port number that is allowed to be proxied for Secure Socket Layer
# (SSL) connections, e.g. https. This option should be set to 443 to
# allow https; there can be more than one ssl-allow-port entry for other ports
# as required.
#
# dir-perm = (octal int)
# The directory permissions to use when creating spool directories
# (default=0755). This option overrides the umask of the user and must
# be in octal starting with a '0'.
#
# file-perm = (octal int)
# The file permissions to use when creating spool files (default=0644).
# This option overrides the umask of the user and must be in octal
# starting with a '0'.
#
# run-online = (filename)
# The name of a program to run when WWWOFFLE is switched to online mode
# (default=none). The program is started with a single parameter set to
# the current mode name "online".
#
# run-offline = (filename)
# The name of a program to run when WWWOFFLE is switched to offline mode
# (default=none). The program is started with a single parameter set to
# the current mode name "offline".
#
# run-autodial = (filename)
# The name of a program to run when WWWOFFLE is switched to autodial
# mode (default=none). The program is started with a single parameter
# set to the current mode name "autodial".
#
# run-fetch = (filename)
# The name of a program to run when a WWWOFFLE fetch starts or stops
# (default=none). The program is started with two parameters, the first
# is the word "fetch" and the second is "start" or "stop".
#
# lock-files = yes | no
# Enable the use of lock files to stop more than one WWWOFFLE process
# from downloading the same URL at the same time (default=no).
#

Options
{
log-level = important

socket-timeout = 120
dns-timeout = 60
connect-timeout = 30

connect-retry = no

ssl-allow-port = 443

dir-perm = 0755
file-perm = 0644

lock-files = no
}


#
# OnlineOptions
# -------------
#
# Options that control how WWWOFFLE behaves when it is online.
#
# [<URL-SPEC>] request-changed = (time)
# While online pages will only be fetched if the cached version is older
# than this specified time in seconds (default=600). Setting this value
# negative will indicate that cached pages are always used while online.
# Longer times can be specified with a 'm', 'h', 'd' or 'w' suffix for
# minutes, hours, days or weeks (e.g. 10m=600).
#
# [<URL-SPEC>] request-changed-once = yes | no
# While online pages will only be fetched if the cached version has not
# already been fetched once this session online (default=yes). This
# option takes precedence over the request-changed option.
#
# [<URL-SPEC>] request-expired = yes | no
# While online pages that have expired will always be requested again
# (default=no). This option takes precedence over the request-changed
# and request-changed-once options.
#
# [<URL-SPEC>] request-no-cache = yes | no
# While online pages that ask not to be cached will always be requested
# again (default=no). This option takes precedence over the
# request-changed and request-changed-once options.
#
# [<URL-SPEC>] try-without-password = yes | no
# If a request is made for a page that contains a username and password
# then a request is made for the same page without a username and
# password specified (default=yes). This allows for requests for the
# page without a password to re-direct the browser to the passworded
# version.
#
# [<URL-SPEC>] intr-download-keep = yes | no
# If the browser closes the connection while online the currently
# downloaded incomplete page should be kept (default=no).
#
# [<URL-SPEC>] intr-download-size = (integer)
# If the browser closes the connection while online the page should
# continue to download if it is smaller than this size in kB
# (default=1).
#
# [<URL-SPEC>] intr-download-percent = (integer)
# If the browser closes the connection while online the page should
# continue to download if it is more than this percentage complete
# (default=80).
#
# [<URL-SPEC>] timeout-download-keep = yes | no
# If the server connection times out while reading then the currently
# downloaded incomplete page should be kept (default=no).
#

OnlineOptions
{
request-changed = 10m

request-changed-once = yes

request-expired = no

request-no-cache = no

try-without-password = yes

intr-download-keep = no
intr-download-size = 1
intr-download-percent = 80

timeout-download-keep = no
}


#
# OfflineOptions
# --------------
#
# [<URL-SPEC>] pragma-no-cache = yes | no
# Whether to request a new copy of a page if the request from the
# browser has 'Pragma: no-cache' (default=yes). This option should
# be set to 'no' if, when browsing offline, all pages are
# re-requested by a 'broken' browser.
#
# [<URL-SPEC>] confirm-requests = yes | no
# Whether to return a page requiring user confirmation instead of
# automatically recording requests made while offline (default=no).
#
# [<URL-SPEC>] dont-request = yes | no
# Do not request any URLs that match this when offline (default=no).
#

OfflineOptions
{
pragma-no-cache = yes

confirm-requests = no

#### Example ####
# Don't request any URLs at all when offline.
# <*://*/*> dont-request = yes
}


#
# FetchOptions
# ------------
#
# Options that control what is downloaded when fetching pages that were
# requested while offline.
#
# stylesheets = yes | no
# If style sheets are to be fetched (default=no).
#
# images = yes | no
# If images are to be fetched (default=no).
#
# frames = yes | no
# If frames are to be fetched (default=no).
#
# scripts = yes | no
# If scripts (e.g. Javascript) are to be fetched (default=no).
#
# objects = yes | no
# If objects (e.g. Java class files) are to be fetched (default=no).
#

FetchOptions
{
stylesheets = no

images = yes

frames = yes

scripts = no

objects = no
}


#
# IndexOptions
# ------------
#
# Options that control what is displayed in the indexes.
#
# no-lasttime-index = yes | no
# Disables creation of the lasttime/prevtime indexes (default=no).
#
# <URL-SPEC> list-outgoing = yes | no
# Choose if the URL is to be listed in the outgoing index (default=yes).
#
# <URL-SPEC> list-latest = yes | no
# Choose if the URL is to be listed in the lasttime/prevtime and
# lastout/prevout indexes (default=yes).
#
# <URL-SPEC> list-monitor = yes | no
# Choose if the URL is to be listed in the monitor index (default=yes).
#
# <URL-SPEC> list-host = yes | no
# Choose if the URL is to be listed in the host indexes (default=yes).
#
# <URL-SPEC> list-any = yes | no
# Choose if the URL is to be listed in any of the indexes (default=yes).
#

IndexOptions
{
no-lasttime-index = no

#### Example ####
# Don't index any hosts in the barfoo.com domain.
# <*://*.barfoo.com/> list-any = no
# Don't index any gif or jpg files in the lasttime index.
# <*://*/*.gif> list-latest = no
# <*://*/*.jpg> list-latest = no
}


#
# ModifyHTML
# ----------
#
# Options that control how the HTML that is provided from the cache is modified.
#
# [<URL-SPEC>] enable-modify-html = yes | no
# Enable the HTML modifications in this section (default=no). With this
# option disabled the following HTML options will not have any effect.
# With this option enabled there is a small speed penalty.
#
# [<URL-SPEC>] add-cache-info = yes | no
# At the bottom of all of the spooled pages the date that the page was
# cached and some navigation buttons are to be added (default=no).
#
# [<URL-SPEC>] anchor-cached-begin = (HTML code)
# Anchors (links) in the spooled page that are in the cache are to have
# the specified HTML inserted at the beginning (default="").
#
# [<URL-SPEC>] anchor-cached-end = (HTML code)
# Anchors (links) in the spooled page that are in the cache are to have
# the specified HTML inserted at the end (default="").
#
# [<URL-SPEC>] anchor-requested-begin = (HTML code)
# Anchors (links) in the spooled page that have been requested for
# download are to have the specified HTML inserted at the beginning
# (default="").
#
# [<URL-SPEC>] anchor-requested-end = (HTML code)
# Anchors (links) in the spooled page that have been requested for
# download are to have the specified HTML inserted at the end
# (default="").
#
# [<URL-SPEC>] anchor-not-cached-begin = (HTML code)
# Anchors (links) in the spooled page that are not in the cache or
# requested are to have the specified HTML inserted at the beginning
# (default="").
#
# [<URL-SPEC>] anchor-not-cached-end = (HTML code)
# Anchors (links) in the spooled page that are not in the cache or
# requested are to have the specified HTML inserted at the end
# (default="").
#
# [<URL-SPEC>] disable-script = yes | no
# Removes all scripts and scripted events (default=no).
#
# [<URL-SPEC>] disable-blink = yes | no
# Removes the <blink> tag (default=no).
#
# [<URL-SPEC>] disable-meta-refresh = yes | no
# Removes any meta tags in the HTML header that re-direct the browser to
# change to another page after an optional delay (default=no).
#
# [<URL-SPEC>] disable-meta-refresh-self = yes | no
# Removes any meta tags in the HTML header that re-direct the browser to
# reload the same page after a delay (default=no).
#
# [<URL-SPEC>] demoronise-ms-chars = yes | no
# Replaces strange characters that some Microsoft Applications put into
# HTML with character equivalents that most browsers can display
# (default=no). The idea for this comes from the public domain
# Demoroniser perl script.
#
# [<URL-SPEC>] disable-animated-gif = yes | no
# Disables the animation in animated GIF files (default=no).
#
# [<URL-SPEC>] enable-modify-online = yes | no
# Enable the modifications in this section to take place when online as
# well as when offline (default=no). This will cause the HTML or GIF
# to not appear in the browser until WWWOFFLE has processed it all.
# This still does not apply to pages that are not cached.
#

ModifyHTML
{
enable-modify-html = no

add-cache-info = no

#anchor-cached-begin = <font color="#00B000">
#anchor-cached-end = </font>
#anchor-requested-begin = <font color="#B0B000">
#anchor-requested-end = </font>
#anchor-not-cached-begin = <font color="#B00000">
#anchor-not-cached-end = </font>

disable-script = no
disable-blink = no

disable-meta-refresh = no
disable-meta-refresh-self = no

demoronise-ms-chars = no

disable-animated-gif = no

enable-modify-online = no
}


#
# LocalHost
# ---------
#
# A list of hostnames that the host running the WWWOFFLE server may be known by.
# This is so that the proxy does not need to contact itself if the request has a
# different name for the same server.
#
# (host)
# A hostname or IP address that in connection with the port number (in
# the StartUp section) specifies the WWWOFFLE proxy HTTP server. The
# hostnames must match exactly, it is not a WILDCARD match. The first
# named host is used as the server name for several features so must be
# a name that will work from any client host on the network. None of
# the hosts named here are cached or fetched via a proxy.
#

LocalHost
{
localhost
127.0.0.1

#### Example ####
# The server is on www.foo.com, with IP address 11.22.33.44.
# www.foo.com
# 11.22.33.44
}


#
# LocalNet
# --------
#
# A list of hostnames whose web servers are always accessible even when offline
# and are not to be cached by WWWOFFLE because they are on a local network.
#
# (host)
# A hostname or IP address that is always available and is not to be
# cached by WWWOFFLE. The host name matching uses WILDCARDs. A host
# can be excluded by appending a '!' to the start of the name, all
# possible aliases and IP addresses for the host are also required. All
# entries here are assumed to be reachable even when offline. None of
# the hosts named here are cached or fetched via a proxy.
#

LocalNet
{

#### Example ####
# The local domain is foo.com so don't cache any hosts in it.
# *.foo.com
}


#
# AllowedConnectHosts
# -------------------
#
# A list of client hostnames that are allowed to connect to the server.
#
# (host)
# A hostname or IP address that is allowed to connect to the server.
# The host name matching uses WILDCARDs. A host can be excluded by
# appending a '!' to the start of the name, all possible aliases and IP
# addresses for the host are also required. All of the hosts named in
# LocalHost are also allowed to connect.
#

AllowedConnectHosts
{

#### Example ####
# Only allow connections from hosts in the foo.com domain.
# *.foo.com
*.*.*
*
}


#
# AllowedConnectUsers
# -------------------
#
# A list of the users that are allowed to connect to the server and their
# passwords.
#
# (username):(password)
# The username and password of the users that are allowed to connect to
# the server. If this section is left empty then no user authentication
# is done. The username and password are both stored in plaintext
# format. This requires the use of browsers that handle the HTTP/1.1
# proxy authentication standard.
#

AllowedConnectUsers
{

#### Example ####
# Only allow connections from this one user.
# andrew:password
#user:user
}


#
# DontCache
# ---------
#
# A list of URLs that are not to be cached by WWWOFFLE.
#
# [!]URL-SPECIFICATION
# Do not cache any URLs that match this. The URL-SPECIFICATION can be
# negated to allow matches to be cached. The URLs will not be requested
# if offline.
#

DontCache
{

#### Example ####
# Don't cache any hosts in the barfoo.com domain.
# *://*.barfoo.com/
# Don't cache any gzipped or tar files.
# *://*/*.gz
# *://*/*.tar
# Don't cache any files from /volatile in the foo.com domain.
# *://*.foo.com/volatile/*
}


#
# DontGet
# -------
#
# A list of URLs that are not to be got by WWWOFFLE (because they contain only
# junk adverts for example).
#
# [!]URL-SPECIFICATION
# Do not get any URLs that match this. The URL-SPECIFICATION can be
# negated to allow matches to be got.
#
# [<URL-SPEC>] replacement = (URL)
# The URL to use to replace any URLs that match the URL-SPECIFICATIONs
# instead of using the standard error message (default=none). The URL
# /local/images/trans-1x1.gif is a suggested replacement (a 1x1 pixel
# transparent gif).
#
# <URL-SPEC> get-recursive = yes | no
# Choose whether to get URLs that match this when doing a recursive
# fetch (default=yes).
#
# <URL-SPEC> location-error = yes | no
# When a URL reply contains a 'Location' header that redirects to a URL
# that is not got (specified in this section) then the reply is modified
# to be an error message instead (default=no). This will stop ISP
# proxies from redirecting users to adverts if the advert URLs are
# in this section.
#

DontGet
{

#replacement = /local/images/trans-1x1.gif

#### Example ####
# Don't get from any hosts in the barfoo.com domain.
# *://*.barfoo.com/
# Don't get any gzipped or tar files.
# *://*/*.gz
# *://*/*.tar
# Don't get any files from /adverts in the foo.com domain.
# *://*.foo.com/adverts*
# Don't get any gzipped or tar files when getting recursively.
# <*://*/*.gz> get-recursive = no
# <*://*/*.tar> get-recursive = no
}


#
# CensorHeader
# ------------
#
# A list of HTTP header lines that are to be removed from the requests sent to
# web servers and the replies that come back from them.
#
# [<URL-SPEC>] (header) = yes | no | (string)
# A header field name (e.g. From, Cookie, Set-Cookie, User-Agent) and the
# string to replace the header value with (default=no). The header is
# case sensitive, and does not have a ':' at the end. The value of "no"
# means that the header is unmodified, "yes" or no string can be used to
# remove the header or a string can be used to replace the header. This
# only replaces headers it finds, it does not add any new ones.
#
# [<URL-SPEC>] referer-self = yes | no
# Sets the Referer header to the same as the URL being requested
# (default = no).
#
# [<URL-SPEC>] referer-self-dir = yes | no
# Sets the Referer header to the directory name of the URL being
# requested (default = no). This option takes precedence over
# referer-self.
#

CensorHeader
{

### Example ###
# Don't send the username.
# From = yes
# Don't accept Cookies
# Set-Cookie = yes
# Don't send Cookies back
# Cookie = yes
# Lie about the Browser type.
# User-Agent = WWWOFFLE/2.5
}


#
# FTPOptions
# ----------
#
# Options to use when fetching files using the ftp protocol.
#
# anon-username = (string)
# The username to use for anonymous ftp (default=anonymous).
#
# anon-password = (string)
# The password to use for anonymous ftp (default determined at run
# time). If using a firewall then this may contain a value that is not
# valid to the FTP server and may need to be set to a different value.
#
# <URL-SPEC> auth-username = (string)
# The username to use on a host instead of the default anonymous
# username.
#
# <URL-SPEC> auth-password = (string)
# The password to use on a host instead of the default anonymous
# password.
#

FTPOptions
{
anon-username = anonymous
#anon-password =
}


#
# MIMETypes
# ---------
#
# MIME Types to use when serving files that were not fetched using HTTP or for
# files on the built-in web-server.
#
# default = (mime-type)/(subtype)
# The default MIME type (default=text/plain).
#
# .(file-ext) = (mime-type)/(subtype)
# The MIME type to associate with a file extension. The '.' must be
# included in the file extension. If more than one extension matches
# then the longest one is used.
#

MIMETypes
{
default = text/plain

.Z = application/x-compress
.au = audio/basic
.avi = video/x-msvideo
.class = application/java
.cpio = application/x-cpio
.css = text/css
.deb = application/octet-stream
.dtd = application/xml
.dvi = application/x-dvi
.eps = application/postscript
.gif = image/gif
.gz = application/x-gzip
.htm = text/html
.html = text/html
.jpeg = image/jpeg
.jpg = image/jpeg
.js = application/x-javascript
.latex = application/x-latex
.man = application/x-troff-man
.me = application/x-troff-me
.mov = video/quicktime
.mpeg = video/mpeg
.mpg = video/mpeg
.ms = application/x-troff-ms
.pac = application/x-ns-proxy-autoconfig
.pbm = image/x-portable-bitmap
.pdf = application/pdf
.pgm = image/x-portable-graymap
.png = image/png
.pnm = image/x-portable-anymap
.ppm = image/x-portable-pixmap
.ps = application/postscript
.ras = image/x-cmu-raster
.rgb = image/x-rgb
.rpm = application/octet-stream
.rtf = application/rtf
.snd = audio/basic
.tar = application/x-tar
.tcl = application/x-tcl
.tex = application/x-tex
.texi = application/x-texinfo
.texinfo = application/x-texinfo
.tif = image/tiff
.tiff = image/tiff
.tr = application/x-troff
.txt = text/plain
.vr = model/vrml
.wav = audio/x-wav
.wrl = model/vrml
.xbm = image/x-xbitmap
.xml = application/xml
.xpm = image/x-xpixmap
.xwd = image/x-xwindowdump
.zip = application/zip
}


#
# Proxy
# -----
#
# This contains the names of the HTTP (or other) proxies to use external to the
# WWWOFFLE server machine.
#
# [<URL-SPEC>] proxy = (host[:port])
# The hostname and port on it to use as the proxy.
#
# <URL-SPEC> auth-username = (string)
# The username to use on a proxy host to authenticate WWWOFFLE to it.
# The URL-SPEC in this case refers to the proxy and not the URL being
# retrieved.
#
# <URL-SPEC> auth-password = (string)
# The password to use on a proxy host to authenticate WWWOFFLE to it.
# The URL-SPEC in this case refers to the proxy and not the URL being
# retrieved.
#
# [<URL-SPEC>] ssl = (host[:port])
# A proxy server that should be used for Secure Socket Layer (SSL)
# connections, e.g. https. Note that for the <URL-SPEC> only the host
# is checked and that the other parts must be '*' wildcards.
#

Proxy
{
<http://*> proxy = none

#### Example ####
# Use www.foo.com as a default http proxy server on port 8080
# Except for the foo.com domain which has no proxy.
# <http://*> proxy = www.foo.com:8080
# <*://foo.com> proxy = none
}


#
# Alias
# -----
#
# A list of aliases that are used to replace the server name and path with
# another server name and path. Also for servers known by two names.
#
# URL-SPECIFICATION = URL-SPECIFICATION
# Any requests that match the first URL-SPECIFICATION are replaced by
# the second URL-SPECIFICATION. The URL-SPECIFICATIONs must match
# exactly, it is not a WILDCARD match, the URL arguments are ignored.
#

Alias
{

#### Example ####
# The http server www.bar.com is mirrored locally at www.bar-mirror.foo.com
# http://www.bar.com/ = http://www.bar-mirror.foo.com/
# The wwwoffle homepage can be aliased
# http://wwwoffle/ = http://www.gedanken.demon.co.uk/wwwoffle/
}


#
# Purge
# -----
#
# The method to determine which pages to purge, the default age, the
# host-specific maximum age of the pages in days, and the maximum cache size.
#
# use-mtime = yes | no
# The method to use to decide which files to purge, last access time
# (atime) or last modification time (mtime) (default=no).
#
# max-size = (size)
# The maximum size for the cache in MB after purging (default=0). A
# maximum cache size of 0 means there is no limit to the size. If this
# and the min-free options are both used the smaller cache size is
# chosen. This option takes into account the URLs that are never purged
# when measuring the cache size but will not purge them.
#
# min-free = (size)
# The minimum amount of free disk space in MB after purging (default=0).
# A minimum disk free of 0 means there is no limit to the free space.
# If this and the max-size options are both used the smaller cache size
# is chosen. This option takes into account the URLs that are never
# purged when measuring the cache size but will not purge them.
#
# use-url = yes | no
# If true then use the URL to decide on the purge age, otherwise use the
# protocol and host only (default=no).
#
# del-dontget = yes | no
# If true then delete the URLs that match the entries in the DontGet
# section (default=no).
#
# del-dontcache = yes | no
# If true then delete the URLs that match the entries in the DontCache
# section (default=no).
#
# [<URL-SPEC>] age = (age)
# The maximum age in the cache for URLs that match this (default=14).
# An age of zero means not to keep, negative not to delete. The
# URL-SPECIFICATION matches only the protocol and host unless use-url is
# set to true. Longer times can be specified with a 'w', 'm' or 'y'
# suffix for weeks, months or years (e.g. 2w=14).
#

Purge
{
use-mtime = no

max-size = 0
min-free = 0

use-url = no

del-dontget = yes
del-dontcache = yes

age = 4w

#### Example ####
# Expire hosts in the domain foo.com at 1 week except bar.foo.com at 2 weeks.
# <*://foo.com/> age = 1w
# <*://bar.foo.com/> age = 2w
# Never keep anything in the domain bar.com except foo.bar.com is always kept.
# <*://bar.com/> age = 0
# <*://foo.bar.com/> age = -1
#
# Keep ftp files for 7 days and http for 14.
# <ftp://*> age = 7
# <http://*> age = 14
#
# Purge files to keep the cache below 100 MB
# max-size = 100
}


#
# WILDCARD
# --------
#
# A wildcard match is one that uses the '*' character to represent any group of
# characters.
#
# This is basically the same as the command line file matching expressions in
# DOS or the UNIX shell, except that the '*' can match the '/' character. A
# maximum of 2 '*' characters can be used in any wildcard.
#
# For example
#
# *.gif matches foo.gif and bar.gif
# *.foo.com matches www.foo.com and ftp.foo.com
# /foo/* matches /foo/bar.html and /foo/bar/foobar.html
#
#
# URL-SPECIFICATION
# -----------------
#
# When specifying a host and protocol and pathname in many of the sections a
# URL-SPECIFICATION can be used, this is a way of recognising a URL.
#
# For the purposes of this explanation a URL is considered to be made up of five
# parts.
#
# proto The protocol that is used (e.g. 'http', 'ftp')
# host The server hostname (e.g. 'www.gedanken.demon.co.uk').
# port The port number on the host (e.g. default of 80 for HTTP).
# path The pathname on the host (e.g. '/bar.html') or a directory name
# (e.g. '/foo/').
# args Optional arguments with the URL used for CGI scripts etc.
# (e.g. 'search=foo').
#
# For example the WWWOFFLE homepage: http://www.gedanken.demon.co.uk/wwwoffle/
# The protocol is 'http', the host is 'www.gedanken.demon.co.uk', the port is
# the default (in this case 80), and the pathname is '/wwwoffle/'.
#
# In general this is written as (proto)://(host)[:(port)]/(path)[?(args)]
#
# Where [] indicates an optional feature, and () indicate a user supplied name
# or number.
#
# Some example URL-SPECIFICATION options are the following:
#
# *://* Any protocol, Any host, Any port, Any path, Any args
# (This is the same as saying 'default').
#
# *://*/(path) Any protocol, Any host, Any port, Named path, Any args
#
# *://*/*.(ext) Any protocol, Any host, Any port, Named path, Any args
#
# *://*/*? Any protocol, Any host, Any port, Any path, No args
#
# *://(path)?* Any protocol, Any host, Any port, Named path, Any args
#
# *://(host) Any protocol, Named host, Any port, Any path, Any args
#
# (proto):// Named protocol, Any host, Any port, Any path, Any args
#
# (proto)://(host) Named protocol, Named host, Any port, Any path, Any args
#
# (proto)://(host): Named protocol, Named host, Default port, Any path, Any args
#
# *://(host):(port) Any protocol, Named host, Named port, Any path, Any args
#
# The matching of the host, the path and the args use the wildcard matching that
# is described above.
#
#
# In some sections that accept URL-SPECIFICATIONs they can be negated by
# appending the '!' character to the start. This will mean that the comparison
# of a URL with the URL-SPECIFICATION will return the logically opposite value
# to what would be returned without the '!'. If all of the URL-SPECIFICATIONs
# in a section are negated and '*://*/*' is added to the end then the sense of
# the whole section is negated.
#
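Once the config is in place, a quick sanity check from the Linux box itself can be done with the standard `http_proxy` environment variable, which command-line clients such as wget honour. The address below is only a placeholder for the Linux machine's LAN IP; the port must match the `http-port` value from the StartUp section above. (Switching WWWOFFLE between modes is done with `wwwoffle -online` / `wwwoffle -offline`, or via the control pages on port 8081.)

```shell
# Placeholder LAN address of the SuSE machine -- adjust to your network.
PROXY_HOST=192.168.0.1
PROXY_PORT=8080   # must match http-port in the StartUp section

# Command-line HTTP clients honour this variable, so e.g. wget will
# fetch through WWWOFFLE instead of connecting directly.
http_proxy="http://${PROXY_HOST}:${PROXY_PORT}/"
export http_proxy
echo "$http_proxy"
```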
 
