Date:      Tue, 27 May 2003 08:22:49 -0400
From:      "Dave [Nexus]" <dave@nexusinternetsolutions.net>
To:        <freebsd-questions@freebsd.org>
Subject:   shell/awk scripting help - parsing directories to gain user/file information for commands
Message-ID:  <DBEIKNMKGOBGNDHAAKGNOEBEPEAB.dave@nexusinternetsolutions.net>

I have a number of uses for this; I am trying to get away from maintaining
lengthy static files containing all the statically entered commands to run
from a cron job...

for example:

- need to run webalizer on a number of user websites and directories
- can run webalizer without a customized conf file, but need to provide the
hostname, output dir and other such variables on the command line
- can list all the log files, which give the appropriate information

# ls /www/*/.logs/*.access_log

generates...

/www/user1/.logs/user1domain1.com.access_log
/www/user1/.logs/user1domain2.com.access_log
/www/user2/.logs/user2domain1.com.access_log
/www/user2/.logs/user2domain2.com.access_log
/www/user3/.logs/user3domain1.com.access_log
/www/user3/.logs/user3domain2.com.access_log

what I am trying to script is something that does:

<pseudo code>
for i in /www/*/.logs/*.access_log; do
	ereg (user)(domain_name) from $i
	webalizer -n $domain_name -o /www/$user/stats/
done
</pseudo code>

...as this would eliminate the human error involved in maintaining a file of the
appropriate commands to run via cron every night.
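The loop above can be sketched in plain sh without any regex at all — the user is just the third path component and the domain is the file name minus its suffix. This is only a sketch against the `.logs` layout shown above, and it assumes webalizer will take the log file as its final argument:

```shell
#!/bin/sh
# For each access log, derive the user from the path and the domain
# from the file name, then run webalizer against that log.
for i in /www/*/.logs/*.access_log; do
    [ -f "$i" ] || continue                   # skip if the glob matched nothing
    user=$(echo "$i" | cut -d/ -f3)           # /www/<user>/.logs/...
    domain_name=$(basename "$i" .access_log)  # strip the .access_log suffix
    webalizer -n "$domain_name" -o "/www/$user/stats/" "$i"
done
```

Adding a new domain then only means dropping a new `*.access_log` file in place; the loop picks it up on the next run with nothing else to maintain.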

This is one example; there are a slew of other similar applications I would
use this for. I have played with awk as well, but just can't wrap my head around
this (not a shell scripting person by trade).
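Since awk came up: the same extraction can be sketched as a one-liner that prints the webalizer commands (pipe the output to sh to actually run them). Again just a sketch, assuming the `.logs` path layout shown earlier:

```shell
# Split each path on "/": field 3 is the user, field 5 the log file name.
# Print one webalizer command per log; pipe to sh to execute them.
ls /www/*/.logs/*.access_log 2>/dev/null \
  | awk -F/ '{ domain = $5; sub(/\.access_log$/, "", domain);
               printf "webalizer -n %s -o /www/%s/stats/ %s\n", domain, $3, $0 }'
```

Printing the commands first is also a handy way to eyeball what would run before wiring it into cron.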

Any guidance or insight would be appreciated.

Dave



