unix / linux

  • Update Ubuntu on DigitalOcean via Command Line

    Overview

    It’s essential to keep your DigitalOcean Droplet running the latest version of Ubuntu. Not only does this ensure your server’s security, but it also guarantees compatibility with the latest software and tools. Here’s a simplified yet comprehensive guide to updating Ubuntu using command line prompts.

    Step 1: Secure Login

    First, ensure you’re logged into your Droplet. If you’ve set up SSH key authentication, logging in is hassle-free and doesn’t require a password. Use the following command:

    ssh root@your_droplet_ip

    Step 2: Initiate the Update

    Start with the sudo apt update command. This command prompts Ubuntu’s primary package manager to check for available updates across your software repositories.
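
    sudo apt update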

    Step 3: Apply the Updates

    After the initial update, use sudo apt upgrade to apply the available package updates. This step might prompt you to confirm the installations. Simply enter 'y' or 'yes' to proceed.
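
    sudo apt upgrade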

    Tackling Upgrade Issues

    Occasionally, some packages may resist upgrading. In such cases, the full-upgrade option is your go-to solution. It resolves most upgrade conflicts by allowing necessary package removals:
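
    sudo apt full-upgrade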

    Combining Commands for Efficiency

    For a more efficient workflow, you can combine the update and upgrade commands using the && operator. Adding the -y flag bypasses the confirmation step, automating the process:

    sudo apt update && sudo apt upgrade -y

    Understanding apt vs apt-get

    As a web developer, you might come across two commands: apt and apt-get. Both serve similar functions, but apt is more user-friendly and provides progress feedback. While apt-get is still widely used in scripts due to its stability and predictability, for regular maintenance tasks, apt is sufficient and recommended.
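
    Since apt itself warns that its command-line interface is not yet guaranteed stable for scripting, a provisioning script or cron job would typically use the apt-get equivalent of the one-liner above:

    sudo apt-get update && sudo apt-get upgrade -y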

    Final Thoughts

    Regular updates are a vital part of server management, ensuring security and optimal performance. This guide aims to make the update process on DigitalOcean as smooth as possible, especially for those who juggle multiple development tasks. Remember, keeping your server updated is not just a best practice; it’s a necessity in the fast-evolving world of web development.

  • Secure WordPress File Permissions: A Developer’s Guide to Hardening Your Site

    Overview

    As web developers, we know that the flexibility of WordPress is partly due to its ability to allow certain files to be writable by the web server. This functionality is a double-edged sword because while it enables dynamic features, it also opens up security risks, especially on shared hosting platforms.

    To safeguard your WordPress site, it’s crucial to implement stringent file permissions and only relax them temporarily when necessary—for instance, when the site requires write access for certain operations or when handling file uploads.

    Recommended File Permission Practices

    • Ownership: All files should be owned by your web directory user account and be writable by that account. Files that WordPress needs to write to should also be writable by the web server; on some hosting setups, this may mean those files need to be group-owned by the web server’s user account (a minimal example follows this list).
    • Root Directory (/): All files here should be writable by you alone, except for .htaccess if you allow WordPress to generate rewrite rules automatically.
    • WordPress Administration Area (/wp-admin/): All files should be writable by the web directory user account alone.
    • WordPress Core Libraries (/wp-includes/): All files should be writable by the web directory user account alone.
    • User Content Area (/wp-content/): This directory is meant to be writable by both the web directory user account and the web server process.
    • Themes (/wp-content/themes/): If using the theme editor, these files need to be writable by the web server process. Otherwise, they should be writable by the web directory user account alone.
    • Plugins (/wp-content/plugins/): All plugin files should be writable by the web directory user account alone.

    Any additional directories within /wp-content/ should follow the guidelines provided by the respective plugin or theme.
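
    As a minimal sketch of the ownership rules above, assuming your shell account is youruser and the web server runs as www-data (both names are placeholders; adjust them to your hosting setup):

    chown -R youruser:youruser /path/to/wordpress
    chown -R youruser:www-data /path/to/wordpress/wp-content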

    Modifying File Permissions

    For those with shell access, file permissions can be adjusted recursively with these commands:

    • Directories: find /path/to/your/wordpress/install/ -type d -exec chmod 755 {} \;
    • Files: find /path/to/your/wordpress/install/ -type f -exec chmod 644 {} \;

    Automatic Updates and File Permissions

    When WordPress is instructed to perform an automatic update, it carries out file operations under the file owner’s user account, not the web server’s. All files default to 0644 and directories to 0755, making them writable by the owner and readable by others, including the web server.
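
    To spot-check the resulting mode and ownership of any file, GNU stat is handy (the path below is a placeholder):

    stat -c '%a %U:%G %n' /path/to/wordpress/wp-config.php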

  • Location of the ClamAV daily scan cron script

    /etc/cron.daily/clamscan
  • Location of Google PageSpeed Module on CentOS

    /usr/local/apache/conf/pagespeed.conf
  • Check all PHP files in your site that have been modified in the last 30 days

    This will check all PHP files in your site that have been modified in the last 30 days. Just be sure to replace /path/to/your-site with the actual path to your site, as you can probably imagine. You can also change php to a different file extension to search more thoroughly. (Strictly speaking, -ctime matches on inode change time; use -mtime -30 instead to match on content modification time.)

    find /path/to/your-site -type f -name "*.php" -ctime -30

    Source: http://premium.wpmudev.org/blog/removing-backdoor-exploits/

  • Find backup folders

    find /home/ -type d -name "*_bak" -ls > /home/bakfolders.txt
  • Find large files and folders on server

    du -h --exclude=virtfs / | grep '^[0-9.]*G' | sort -nr > /home/diskspace.txt
  • Rsync files and directory from local server to a remote server

    # rsync -avz xyz_folder/ user@xyz_server.com:/xyz_folder/
  • How to recursively count files in a linux directory?

    I want to count the number of files in a directory and its subdirectories.

    find DIR_NAME -type f -print | wc -l

    The f in -type f stands for (regular) files, and wc -l counts the lines of output. Remove -type f to include directories in the count.

    Source: http://stackoverflow.com/questions/9157138/how-to-recursively-count-files-in-a-linux-directory

  • Checking Free Disk Space

    Both Linux and UNIX offer two commands for checking free disk space:

    (a) df command : Report file system disk space usage

    (b) du command : Estimate file space usage

    df command examples – to check free disk space

    Type df -h or df -k to list free disk space:

    $ df -h

    OR

    $ df -k

    Source: http://www.cyberciti.biz/faq/check-free-space/

  • How Do I Find The Largest Top 10 Files and Directories

    Type the following command at the shell prompt to find the top 10 largest files/directories:

    cd /path/to/some/where
    du -hsx * | sort -rh | head -10

    Source: http://www.cyberciti.biz/faq/how-do-i-find-the-largest-filesdirectories-on-a-linuxunixbsd-filesystem/

  • Start Command In Background

    You can put a task (such as a command or script) in the background by appending an & at the end of the command line. The & operator puts the command in the background and frees up your terminal. A command that runs in the background is called a job. You can type other commands while the background command is running. The syntax is:

    /path/to/command arg1 arg2 &

    Example:

    $ ls *.py > output.txt &
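
    To list background jobs and bring one back to the foreground, use the jobs and fg shell builtins:

    $ jobs
    $ fg %1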

    Source: http://www.cyberciti.biz/faq/linux-command-line-run-in-background/

  • chmod directory only or file only

    Chmod just directories

    find . -type d -print0 | xargs -0 chmod 755

    Chmod just files

    find . -type f -print0 | xargs -0 chmod 644

    Source: http://forums.whirlpool.net.au/archive/323231

  • HowTo Empty Directory

    To delete all files from /tmp/bar/ directory (including all files from subdirectories such as /tmp/bar/dir1), enter:

    $ cd /tmp/bar/ 
    $ find . -type f -delete
  • Synchronizing folders with rsync

    Update the contents of a folder

    In order to save bandwidth and time, we can avoid copying files that already exist in the destination folder and have not been modified in the source folder. To do this, add the -u parameter to rsync; it synchronizes the destination folder with the source folder, and this is where the delta-transfer algorithm comes in. To synchronize two folders like this, we use:

    rsync -rtvu source_folder/ destination_folder/

    By default, rsync takes into consideration the modification date and the size of a file to decide whether the file, or part of it, needs to be transferred or can be left alone. Instead, we can use a hash to decide whether the file is different or not: the -c parameter performs a checksum on the files to be transferred and skips any file whose checksum matches.

    rsync -rtvuc source_folder/ destination_folder/

    Synchronizing two folders with rsync

    To keep two folders in sync, we not only need to add new files from the source folder to the destination folder, as in the previous topics; we also need to remove from the destination folder the files that have been deleted in the source folder. rsync allows us to do this with the --delete parameter, which, used in conjunction with the previously explained -u parameter, lets us keep two directories in sync while saving bandwidth.

    rsync -rtvu --delete source_folder/ destination_folder/

    The deletion can take place at different moments of the transfer by adding some additional parameters:

    • rsync can look for missing files and delete them before the transfer starts; this is the default behavior and can be set explicitly with --delete-before
    • rsync can look for missing files after the transfer is completed, with the parameter --delete-after (see the example after this list)
    • rsync can delete files during the transfer: when a file is found to be missing, it is deleted at that moment; we enable this behavior with --delete-during
    • rsync can do the transfer and note the missing files during the process, but instead of deleting them as it goes, it waits until the transfer is finished and deletes them afterwards; this is accomplished with --delete-delay
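
    For example, to postpone deletions until after the transfer has finished:

    rsync -rtvu --delete-after source_folder/ destination_folder/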

    Excluding files and directories

    There are many cases in which we need to exclude certain files and directories from rsync. A common case is when we synchronize a local project with a remote repository, or even with the live site; here we may want to exclude some development directories and some hidden files from being transferred over to the live site. Excluding files can be done with --exclude followed by the directory or file that we want to exclude. The source folder or the destination folder can be a local folder or a remote folder, as explained in the previous section.

    rsync -rtv --exclude 'directory' source_folder/ destination_folder/
    rsync -rtv --exclude 'file.txt' source_folder/ destination_folder/
    rsync -rtv --exclude 'path/to/directory' source_folder/ destination_folder/
    rsync -rtv --exclude 'path/to/file.txt' source_folder/ destination_folder/

    Source: http://www.jveweb.net/en/archives/2010/11/synchronizing-folders-with-rsync.html

  • Find files with 777 permission

    find /nas/content/live/lifeofaserver/vhosts/ -perm 777

    … or to save the results to a file:

    find /nas/content/live/lifeofaserver/vhosts/ -perm 777 > /root/files.txt

    Also, if you want to see only the files with that permission, you can use:

    find /nas/content/live/lifeofaserver/vhosts/ -perm 777 -type f

    … or directories:

    find /nas/content/live/lifeofaserver/vhosts/ -perm 777 -type d
  • Possible options to exclude files/directories from backup using tar

    Exclude files using multiple patterns

    tar -czf backup.tar.gz --exclude=PATTERN1 --exclude=PATTERN2 ... /path/to/backup

    Exclude files using an exclude file filled with a list of patterns

    tar -czf backup.tar.gz -X /path/to/exclude.txt /path/to/backup
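
    where /path/to/exclude.txt holds one shell pattern per line, for example (illustrative patterns only):

    *.log
    path/to/backup/cache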

    Exclude files using tags by placing a tag file in any directory that should be skipped

    tar -czf backup.tar.gz --exclude-tag-all=exclude.tag /path/to/backup

    Use a wild card in the --exclude option argument.

    For example, suppose you have 3 directories a, b & c.

    Each directory has 3 files.

    • a: d e f
    • b: g h i
    • c: j k l

    To exclude all of the files in b, use --exclude 'b/*'

    To also exclude the directory b, use --exclude 'b/*' --exclude 'b'
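
    Putting both exclusions together for this example:

    tar -czf backup.tar.gz --exclude='b/*' --exclude='b' a b c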

  • How to Disable Password Authentication for SSH

    Once you have SSH Keys configured, you can add some extra security to your server by disabling password authentication for SSH. (Note that if you do lose your private key, this will make the server inaccessible and you will need to contact HostGator to have this re-enabled.)

    To disable password authentication, edit the SSH daemon’s configuration file:

    vi /etc/ssh/sshd_config

    In this file, set the following directives to these values. If the directives already appear in the file, change them to “no” rather than adding new lines.

    ChallengeResponseAuthentication no
    PasswordAuthentication no
    UsePAM no

    Once this is done, restart the SSH daemon to apply the settings.

    /etc/init.d/sshd restart
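
    On systemd-based distributions, where the init script may not exist, the equivalent command is (the service is named sshd on RHEL/CentOS and ssh on Debian/Ubuntu):

    sudo systemctl restart sshd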

    Source: http://support.hostgator.com/articles/specialized-help/technical/how-to-disable-password-authentication-for-ssh

  • How Do I Display the Contents of a Linux File?

    $ cat filename

    The cat command (short for concatenate) shown in this example is like the DOS type command. In response to cat filename, the system simply splatters the file on the screen. If there is more data in the file than will fit on one screen, the contents whiz by before you can see them. The more command solves this problem by displaying the file screen by screen:

    $ more filename
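
    A related pager is less, which also lets you scroll backwards through the file:

    $ less filename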

    Source: http://lowfatlinux.com/linux-display-files-cat.html

  • Searching for files over 10MB in size

    NOTE:

    To adjust the size of your search, replace +10000k with the size you desire, such as the following:

    • 50MB: +50000k
    • 100MB: +100000k
    • 500MB: +500000k

    find / -type f -size +10000k -exec ls -lh {} \; | awk '{ print $5 ": " $9 }' | sort -n
  • Edit file in gedit

    sudo gedit /some/path/filename.php
  • Easily find or search files or directories

    To search the current directory and all subdirectories for a folder or file, use the following command:

    find . -name filename
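
    Wildcards work too, as long as the pattern is quoted so the shell does not expand it first:

    find . -name "*.log"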

    Source: http://www.tech-recipes.com/rx/976/unixlinux-easily-find-or-search-files-or-directories/

  • How to search for a string in a selection of files

    find . -exec grep "www.athabasca" '{}' \; -print

    This command will search in the current directory and all sub directories. All files that contain the string will have their path printed to standard output.

    Source: http://www.athabascau.ca/html/depts/compserv/webunit/HOWTO/find.htm

  • kill do_backup process for 4PSA Total Plesk Backup

    To debug, first kill the do_backup process:

    killall -9 do_backup

    and run it with debugging:

    /usr/local/tbackup/do_backup -d

    You should see exactly what happens.

  • Shell du command tip – estimate file space usage and exclude particular files

    The du command estimates file space usage and summarizes the disk usage of each FILE, recursively for directories.

    It displays the file system block usage for each file argument and for each directory in the file hierarchy rooted in each directory argument. If no file is specified, it uses the current directory.

    du command Examples

    Type du to display usage in the current directory:

    $ du

    Pass the -h option to display output in bytes, kilobytes, megabytes, gigabytes, terabytes and petabytes (human-readable output):

    $ du -h

    Display the name and size of each png file in the /ramdisk/figs/ directory as well as a total for all of the pngs:

    $ du -hc /ramdisk/figs/*.png

    Another useful option is -c, which produces a grand total:

    $ du -c

    Show the disk usage of the /home/vivek subdirectory:

    $ du /home/vivek

    Show only a summary of the disk usage of /home/vivek:

    $ du -hs /home/vivek

    Exclude files that match PATTERN. For example, do not count *.obj or *.jpg files:

    $ du -h --exclude='*.obj'
    $ du -h --exclude='*.jpg'

    A PATTERN is a shell pattern (not a Perl or other regular expression). The pattern ? matches any one character, whereas * matches any string.

    Pipes and filters with du

    Now display everything sorted by filesize:

    $ du -sk .[A-z]* * | sort -n

    Display a screenful of output at a time, as du generates more output than can fit on the console/screen:

    $ du -h | less

    To find the top 3 directories, enter:

    $ du -sk * | sort -nr | head -3
    4620348 var
    651972  home
    27896   usr

    Working without du

    Finally, here is a one-liner (without the du command) that prints the top 10 file sizes in MB (thanks to dreyser for submitting the idea):

    # find /var -type f | xargs ls -s | sort -rn | awk '{size=$1/1024; printf("%dMb %s\n", size,$2);}' | head

    Output:

    31Mb /var/crash/_usr_lib_firefox_firefox-bin.1000.crash
    22Mb /var/cache/apt/archives/linux-image-2.6.20-16-generic_2.6.20-16.28_i386.deb
    16Mb /var/lib/apt/lists/in.archive.ubuntu.com_ubuntu_dists_feisty_universe_binary-i386_Packages
    15Mb /var/cache/apt/archives/linux-restricted-modules-2.6.20-16-generic_2.6.20.5-16.28_i386.deb
    9Mb /var/cache/apt/srcpkgcache.bin
    9Mb /var/cache/apt/pkgcache.bin
    8Mb /var/cache/apt/archives/firefox_2.0.0.4+1-0ubuntu1_i386.deb
    7Mb /var/cache/apt/archives/linux-headers-2.6.20-16_2.6.20-16.28_i386.deb
    5Mb /var/lib/apt/lists/archive.ubuntu.com_ubuntu_dists_feisty_main_binary-i386_Packages
    5Mb /var/lib/apt/lists/in.archive.ubuntu.com_ubuntu_dists_feisty_universe_source_Sources