Zenoss: Add a device to a Group with ZenDMD
As user zenoss (or whatever user your Zenoss application runs as), run zendmd, and:
Device = find('My Awesome Server')
Device.addDeviceGroup('My Wonderful Group')
commit()
This will make the device named My Awesome Server a member of the group named My Wonderful Group.
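As a small extension of my own (not part of the original tip), the same calls can be looped over in zendmd to add several devices to a group at once; the /Server device class below is just a placeholder, adjust it to whichever organizer holds your devices:
# hedged sketch: every device under the /Server class joins the group
for d in dmd.Devices.Server.getSubDevices():
    d.addDeviceGroup('My Wonderful Group')
commit()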
Keyboard shortcuts on Cisco devices
Shortcut | Description |
---|---|
CTRL+P | [previous] show the previous command |
CTRL+N | [next] show the next command |
CTRL+B | [back] move the cursor one character back |
CTRL+F | [forward] move the cursor one character forward |
ESC+B | [back] move the cursor one word back |
ESC+F | [forward] move the cursor one word forward |
CTRL+A | move the cursor to the beginning of the command |
CTRL+E | [end] move the cursor to the end of the command |
CTRL+D | [delete] delete the selected character |
BACKSPACE | delete the character before the selected one |
CTRL+W | [word] delete a word |
CTRL+U | delete the entire line |
Mozilla Thunderbird
Connecting the Thunderbird email client to your Exchange server: tested, but slow and awkward. Folders other than the Inbox don't get highlighted automatically when new emails arrive in them.
Send Apache logs to remote syslog server
This document briefly describes how to send the logs from Apache to a remote server, as well as log them locally. There are benefits to both approaches, local and remote logging, and with this method you will have both.
Two types of logs are described: the CustomLog and the ErrorLog. Also, my preferred method of shipping logs is with nc, due to its simplicity.
CustomLog
The CustomLog is provided by the log_config module, which is included in the default installation of Apache (at least on CentOS 6.5 and on Ubuntu 12.10, on which I tested). This module is flexible enough to allow multiple CustomLog directives, so to get Apache to log both to local files and to a remote syslog server you can use two lines:
CustomLog logs/access_log combined
CustomLog "| nc -u -j syslog.example.com 514" combined
You will need to adjust syslog.example.com to your syslog server, and 514 to whatever port your server is listening on, if it's not the default.
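Before touching the Apache configuration, you may want to sanity-check that UDP messages actually reach the syslog server; this check is my own addition, and it reuses the same placeholder hostname and port as above. Send a test line through the same nc pipe:
echo "test message" | nc -u -j syslog.example.com 514
and watch for it on the syslog server, for example with tcpdump:
tcpdump -n -i any udp port 514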
ErrorLog
The ErrorLog is provided by the core module, and unfortunately it's not as flexible as CustomLog: it doesn't allow multiple ErrorLog directives. If you add more than one, only the one that occurs last in the configuration file will be used. Luckily, you can utilize the power of tee to overcome this:
ErrorLog "| tee -a /var/log/httpd/error_log | nc -u -j syslog.example.com 514"
See also
- Sending web logs to Computer Security from Fermilab, source of the tee tip.
- Apache HTTPD CustomLog to Syslog via UDP, source of the nc -u -j tip.
Create an encrypted directory on Ubuntu
These are very brief instructions on how to create an encrypted directory on an Ubuntu system. A usage scenario for this, is that you can keep your files encrypted while stored on a cloud storage service, and decrypt them on-demand only when necessary.
Install encfs:
sudo apt-get install encfs
Link the encrypted and decrypted directories:
encfs ~/.encrypted ~/visible
If the directories don't exist, encfs will ask to create them.
To unmount:
fusermount -u ~/visible
To re-mount:
encfs ~/.encrypted ~/visible
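To tie this back to the cloud storage scenario mentioned at the top, one approach (my own suggestion; the Dropbox path is just an example) is to keep the encrypted side inside the folder your sync client watches, so that only ciphertext ever leaves your machine:
encfs ~/Dropbox/.encrypted ~/visible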
Contribute to an open source project on GitHub
This document will guide you through the steps required to contribute some source code fix or enhancement to an open source project hosted on GitHub. It is a bottom-up tutorial, starting from signing up to GitHub.
As a real-life example for this guide, we will make a contribution to a not-so-typical GitHub project, the Free Programming Books list. This GitHub project is the source of the content of the web page List of Free Programming Books.
Sign Up to GitHub
First, you need to have an account on GitHub. There are non-free accounts for non-open-source projects, but for the purpose of this guide, and for contributing to Open Source software, you only need a free account.
Head over to github.com and Sign Up!
Fork a Project
In order to get the code of a project, edit it and send it back to the developers, you need to create a fork of the source code. For the purpose of this example, let's assume that your username is "ChuckNorris" (meh... as if Chuck Norris needs to fork and edit source code... the source code fixes itself to avoid being roundhouse kicked...).
Head over to the page where the project you are interested in is hosted. For our example, go to github.com/vhf/free-programming-books, and click on the Fork button. GitHub will take a few seconds, and will create a copy of the original repository under your account; in our example it would be github.com/ChuckNorris/free-programming-books.
Install Git
You will need git
installed on your PC/workstation to
download code from GitHub. On an Ubuntu (or other Debian-based) system,
do:
sudo apt-get install git
On a CentOS or other RedHat-based system:
yum install git
The rest of the instructions in this guide are the same on Ubuntu (or similar) and CentOS (or similar) systems, although Ubuntu typically comes with newer versions of packages, and might behave better with Git operations. After you have installed Git, you should set your username and email address. These will be submitted with the changes that you will make to the original repository:
git config --global user.name "Your Name"
git config --global user.email [email protected]
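To double-check the values you just set (this verification step is my own addition), list your global Git configuration:
git config --global --list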
Download the source code
To download the source code locally on your PC/workstation, from your GitHub repository:
git clone https://github.com/ChuckNorris/free-programming-books.git
This will create a directory named after the project (in our example "free-programming-books"), and put the source code files inside. You can now browse the source code files locally, on your workstation.
Connect the local directory with the original GitHub repository
Now that you downloaded the source code from your own repository, git knows where to push the changes that you will make. In git parlance, your repository is the origin.
You will most probably need to also create a connection to the original GitHub repository, so that Git knows where to pull updates from. In other words, if the developers of the original project do some changes, you should be able to update your directory on your workstation with those updates, so that you always work on the most current version of the code. In Git parlance, the original repository is the upstream.
To add the upstream repository:
cd free-programming-books
git remote add upstream https://github.com/vhf/free-programming-books.git
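To confirm that both remotes point where you expect, with your fork as the origin and the original project as the upstream (this quick check is my own addition), list them:
git remote -v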
Now, every time you want to work on the project, you can get an updated version of it by running:
git fetch upstream
This will not change any files that you already edited locally, unless you also do:
git merge upstream/master
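Putting these together, and assuming you work on the master branch as in the rest of this guide (this summary is my own), a typical way to bring both your local copy and your fork fully up to date is:
git checkout master
git fetch upstream
git merge upstream/master
git push origin master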
Edit source files and push changes
At this point, you should have a copy of the source code that you want to edit on your GitHub repository, the one that you forked from the upstream repository, and is now your origin. You should also have a local copy of the source code on a directory on your workstation.
You probably now want to edit one or more of the source code files, using the text editor of your preference. For the sake of this example, we will make a couple of changes in the file named free-programming-books.md, and then push them to the origin. The changes that we will make are:
- Add two programming books for the Picolisp language, and
- Add the Picolisp language to the table of contents.
For the record, here is the diff of the file before and after the changes:
[marios@j free-programming-books]$ diff -u free-programming-books.md-original free-programming-books.md
--- free-programming-books.md-original 2014-01-29 13:10:05.502982051 +0200
+++ free-programming-books.md 2014-01-29 13:06:00.659982149 +0200
@@ -108,6 +108,7 @@
* [PC-BSD](#pc-bsd)
* [Perl](#perl)
* [PHP](#php)
+* [PicoLisp](#picolisp)
* [PostgreSQL](#postgresql)
* [PowerShell](#powershell)
* [Processing](#processing)
@@ -1250,6 +1251,11 @@
* [PHP 5 Power Programming](http://www.informit.com/content/images/013147149X/downloads/013147149X_book.pdf)
+###PicoLisp
+* [PicoLisp by Example](http://www.scribd.com/doc/103733857/PicoLisp-by-Example)
+* [PicoLisp Works](http://www.scribd.com/doc/103732688/PicoLisp-Works)
+
+
###PostgreSQL
* [Practical PostgreSQL](http://www.commandprompt.com/ppbook/)
You can get a result similar to the above with git diff.
After you edit the source code file that you want, and save it on your
workstation, it's time to push the changed file to your own GitHub
repository, the origin. First, you commit your change to Git.
This only affects files locally on your workstation. In this example, the file name is free-programming-books.md:
git commit free-programming-books.md -m 'Added two books on PicoLisp'
git push origin master
The commit command handed the changed file over to Git, with a short message given after the -m option, and the push command then asked Git to push the committed change to GitHub, to the master branch of the origin repository. You will be asked for your GitHub username and password during the push.
You can visit your GitHub repository now, and you will be able to see that the file you edited and pushed has changed.
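As a side note of my own, not part of the original walkthrough: many projects prefer to receive pull requests from a topic branch rather than from master. With a hypothetical branch name, the commit and push steps would then look like this:
git checkout -b picolisp-books
git commit free-programming-books.md -m 'Added two books on PicoLisp'
git push origin picolisp-books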
Ask the original developers to pull your changes
After you have edited one or more files, committed them to Git, and pushed them to your own repository on GitHub, you will probably want to ask the developers of the original source code, the one that resides in the upstream, to review your changes and merge them into the project, if applicable.
This action is called a pull request, and is clearly documented at Using Pull Requests on github.com. For the record, and for the completion of the real-world example used in this guide, the result was Pull Request 672 on the Upstream repository.
Conclusion
The procedure outlined above might seem like a lot at first, but after a couple of times you will realize that it's actually pretty simple. If you were to count the commands used here, you would only have a handful. So, go ahead, become an open source developer!
Logging in Linux
- syslogd, previously the de facto standard syslog implementation, used to be the default on Linux distributions.
- klogd
- metalog
Syslog Servers
- rsyslog is a free and open source syslog server, and the default on recent Ubuntu and CentOS distributions; a minimal configuration for receiving remote logs over UDP is sketched after this list. Paid-for options include a Windows agent that sends the Event log to an rsyslog server.
- syslog-ng is a free and open source syslog server, with great configuration options. Commercial extra options include encryption and a web interface.
- LogStash is free and open source, and combines a syslog server with a web interface for searching and graphing.
- Graylog2 is also free and open source, and like LogStash it combines the functionality of a syslog server with that of an interface to search and graph the data.
- Fluentd is a syslog server, capable of scaling up massively.
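Since rsyslog is the default syslog server on recent distributions, here is a minimal sketch (my own addition, using the legacy directive syntax) of enabling UDP reception on port 514, which is what would let it receive the Apache logs shipped with nc earlier in these notes; put it in /etc/rsyslog.conf or in a file under /etc/rsyslog.d/:
$ModLoad imudp
$UDPServerRun 514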
Web Interfaces
- Kibana is a web interface for logs collected with LogStash or with other data stored in ElasticSearch.
- Octopussy is a web interface with searching and graphing features. Installation instructions exist for RedHat- and Debian-based systems.
- LogAnalyzer is a web frontend for syslog, with some analysis and reporting capabilities.
Free and Open Source Log Analysis Software
- LogReport does log analysis and reporting, but it seems that its development has stopped.
- LogSurfer analyzes logs line by line against predefined regular expressions, and can trigger notifications.
- Epylog is a time-based log analysis tool, which sends reports and alerts by email. It is a replacement for logwatch.
- SEC does log analysis with focus on event correlation.
- ELSA is an analysis and search tool for syslog-ng with MySQL for backend and Sphinx for indexing.
- Clarity is a simple web front end for the contents of a directory with log files, with grep-like and tail-f-like features.
Non-Free or Closed Source or Commercial Log Analysis Software
- LogZilla is free of charge for up to 10 devices and up to 1 million messages per day. Beyond those limits, the price scales up according to the selected features. Documentation includes instructions for RedHat- and Debian-based systems.
- Splunk is free for up to 500 MBytes of data per day. Download options include packages for Linux distributions with a 2.6+ kernel.
- CloudPelican is still in development as of this writing. Their website mentions that there is a free version, but downloading the demo requires registration.
- XPOLog is a freeware log analysis software with a standalone web server.
- HP ArcSight Logger is a log analysis commercial solution by HP.
- LogScape is a Linux based log analysis and indexing tool, with a free basic version.
- Sumo Logic offers cloud-based log management and analytics. The free version works for up to 500 MBytes of data per day, up to three users, and up to 7 days of retention.
- Sawmill is a closed source analysis tool, with free 30-day demo versions.
- Loggly offers log management, analysis and graphing. There is a free version for up to 200 MBytes of data per day and 7-day retention.
- Otus SIEM
- LogRhythm offers log management, analysis, and SIEM, with a focus on security and forensics.
Analytics and Analysis
- List of web analytics software
- Log Analysis at DMOZ and Log Analysis at Yahoo Directory.
For Windows
- Snare Backlog was a logging server for Windows that could collect data from several standard sources, as well as from a wide range of operating systems running its agent. Freeware versions of it are still available for download.
- LogFaces is a syslog server for Windows.
- Log Parser is a search tool for logs and other data sources.
Google Servers Migration Paper
This has to be the coolest paper I've read in a long time. It explains how a few engineers working at Google migrated thousands of servers from the ancient Red Hat Linux 7.1 to a modern version of Debian, with minimal downtime, by replacing small bits of the operating system at a time, over many, many iterations.
The paper is from a USENIX conference; here you go, and you're very welcome: Live upgrading thousands of servers from an ancient Red Hat distribution to 10 year newer Debian based one.