Computer Question for Yogi
- Kellemora
- Brainiac Class Poster
- Posts: 3389
- Joined: 07 Jul 2012, 18:52
- Location: From St. Louis, current Knoxville, TN
- Contact:
Computer Question for Yogi
Hi Yogi
This has to do with GRsync, Mount, & Umount.
A brief history. I was never able to PUT a file from my computer to a Remote Computer. The only way we could get my data on a Remote Computer was to use the Remote Computer to FETCH the data.
E.g. Computer A (my office) TO Computer B (in house) always failed for some reason. But Computer B FROM Computer A always worked without errors.
OK - An IT guy stopped by my house the other day and got everything working just fine. Sorta.
This is from my Linux computers in the office, using GRsync to an external Hard Drive connected to my wife's Windows XP computer in the house. This part was working fine, and still is working, but I have created a new problem.
To start with, in order to get GRsync to work, naturally you have to mount the remote Hard Drive.
Mounting the shared folder from the external hard drive worked like a charm.
The shared folder, 2015OffSiteStorage, appeared in the /mnt folder on my computer, so it was accessible to GRsync and synced perfectly with the external drive at the house.
I also use GRSync to copy to my local USB hard drive; I've never had a problem with anything local.
HERE is where my new problems began.
I have a 1 terabyte drive IN a computer up here in my office, using only about 100 gigs for an OS partition. I created a 750 gig partition I named DataStorage, and inside this I created a shared folder named 2015OnSiteStorage. So far so good.
Running umount /mnt/2015OnSiteStorage or umount /mnt/2015OffSiteStorage always gives the error that the above names are not mounted according to mtab. Yet I can see they are mounted, and if I open the shared folder copy on my desktop and copy a file to it, it appears in the /mnt folder as well, so it's live and synced. This is confusing enough.
But here is the problem I am writing to ask you about.
I manually move both of the above-named folders to trash so they are NOT in /mnt, and the /mnt folder is empty.
OK, I mount the folder 2015OnSiteStorage, then use GRSync to copy files to it. They appear in this folder on the expected computer. But here is the surprise. I open the external hard drive on the remote computer down at the house, and there is the 2015OnSiteStorage folder, using up almost all of the space on that hard drive, since it was a large batch of files and this is only a 200 gig drive.
How can RSync write to a folder not named in the path?
Since I can't unmount, I moved the 2015OnSiteStorage folder from /mnt to TRASH.
When I ran GRSync to copy the files I placed in 2015OffSiteStorage, I found it also copied the files to my 1 terabyte drive. These computers have DIFFERENT IP addresses, and I use the IP addresses to mount the folders.
I called my IT guy and he said what is happening is totally impossible.
I gave him a copy of how I have GRSync set up, every button turned on or off.
And the output from several commands he had me type in Terminal.
I was looking on-line and found several folks who have had this problem in the past.
GRSync or RSync writing to ALL of their mounted shared folders, even when they are unmounted.
Do you have any idea of what's going on here?
Yes, I know I should be using NFS for the Linux-to-Linux transfers, but I must use CIFS for the Windows machine.
I've tried using the / behind folder names; this just produces a second folder inside the first.
This is what I use in Terminal to mount the remote hard drives over the LAN to /mnt.
sudo mount -t cifs //192.168.1.12/2015OffSiteMirror /mnt -o username=gary,file_mode=0777,dir_mode=0777
sudo mount -t cifs //192.168.1.14/2015OnSiteMirror /mnt -o username=gary,file_mode=0777,dir_mode=0777
They mount just fine, but as I said, won't unmount using umount.
umount /mnt/2015OnSiteMirror always says it is not mounted according to mtab.
My Destination in GRSync is /mnt/2015OnSiteMirror/
or for the alternate session /mnt/2015OffSiteMirror/
It does not matter if I manually move the folder in /mnt that I don't want written to into the TRASH; it still gets written to.
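In case it helps you see what I'm seeing, these are the stock commands I can run in Terminal to check what is actually mounted where (nothing here is special to my setup, just the usual tools):
mount | grep /mnt
findmnt /mnt
grep cifs /etc/mtab
The first two list whatever is currently sitting on /mnt, and the last one shows which cifs shares the system thinks are mounted, which is what umount is checking against.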
HELP!
Thanks Yogi
TTUL
Gary
Re: Computer Question for Yogi
Hello Gary
The short answer to your question is that I have no idea why you are seeing what you see. I truly appreciate all the confidence you put into my experiences with Linux, but this is one area I've not had to deal with to any great extent.
The file transfers you are trying to perform are very complex in that you are crossing over a few different platforms to accomplish the task. None of it is beyond Linux as the success you are experiencing demonstrates. However, you obviously are encountering a bug that could be difficult to exterminate. My instincts would tell me to simplify the setup and to try and get all the device technologies on approximately the same level. For example I'm not sure how well Windows XP can play with the computer on which you have an external terabyte drive. Aside from any possible technical incompatibilities, there is a lot of mounting going on that may not be tracking well across platforms. I'm sure that some configuration table(s) somewhere is confusing the operation. Finding where it is and then fixing the right entries could be like looking for the proverbial needle in a haystack.
Instead of rsync or grsync have you considered an alternative backup scheme? Would running something like SAMBA from a cron do what you want to do? Doing that would require knowledge of cron coding, but maybe your IT buddy has some experience he can lend to you. Also, do you really have to use Windows for the mirror? Could you not just use another Linux machine local to your garage setup? Doing that would simplify the file shares and you would not be asking Windows to play nice with Linux. My guess is that you already thought of some alternatives and have good reason to attempt what you are currently doing. I'm afraid I don't know enough to be of any real help here.
Re: Computer Question for Yogi
Thanks Yogi
The IT guy figured out the problem and returned late Saturday night to try it out.
My mirrors all work perfectly now, and we have them automated too.
Normally one would use SSH to Windows and NFS to Linux, but since all the computers share data with each other, we chose cifs (which is Samba, but newer than using smbfs).
He solved my first problem straight off, fixing it so I could PUT data, rather than having to FETCH data. This was simply a program usage error on my part. One which I never found the answer to on-line.
He's a UNIX Mainframe IT guy for Elavon, and for some things they use RedHat, and Windows 7 on the floor. He uses Ubuntu, and has played with Mint, but never delved into straight Debian before.
Turns out, what he said was totally impossible is actually quite normal. Folders can be symlinked together so whatever you do in one appears in all of them. You can do this on Windows also, which is how Debi lost all of her photo file backups. Only a link back to her own drive was saved on the backup drive. So when her hard drive crashed, she lost all of the photos she thought were backed up.
In essence, that is what was happening here.
I'm sure you know this already, but you create a mount point (usually under /media or /mnt) on your own computer, a folder with the same name as the one on the remote computer. On its own this is only temporary storage where RSync writes the data; it does not mean the data went anywhere else yet.
You have to mount the folder on the remote computer onto your computer so the data actually transfers. This creates a Symlink between your folder named, e.g., backup and the folder named backup on the remote computer. Whatever is written to the folder on your computer appears in the folder on the remote computer, but is actually placed in the remote folder, NOT just a link back to your computer.
You can unmount the mounted drive and go read the folder on the remote computer and all the data is there, as it should be.
Here is where I was confused, and what caused all the folders to appear on all the machines.
The Symlinks are NOT to the shared folder, but to the Mount Point. I was using /mnt and placing a folder from each machine into it. Even though the folders had different names, they all appeared on all the computers.
If I placed a document in a folder, it propagated to all the computers, even when the folders were not mounted; this is what the guy was saying is impossible.
But further study of Symlinks shows it was doing exactly what it was supposed to do. How it could do that without being mounted is what confused him, UNTIL we found "multiple mounts" stacked on the same mount point, and they were all live.
Also, I was placing documents in the Symlinked mount point folder, not in the folder via the Network share. Why that matters I still don't know.
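For anyone else who trips over this, here is roughly how the stacked mounts showed up once we knew to look, and how they peel off one at a time (a sketch from memory, so take the exact output with a grain of salt):
findmnt /mnt
sudo umount /mnt
sudo umount /mnt
Because both shares had been mounted onto /mnt itself rather than onto separate folders under it, each umount /mnt only removed the top one, and the one underneath became live again.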
In any case, he solved the problem easily, once he figured out what was happening.
RedHat, Ubuntu, and Mint, and probably others, install with a certain setting turned off; Debian installs with it turned on.
To make sure it never happens again (like trying to put 200 gigs of data on a 25 gig partition), we created a separate mount point for each computer on my computer. He placed those in /etc/fstab so they get mounted automatically on boot-up. Mount and umount do not work from GRSync for some reason, so the other locations are always mounted on my computer.
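The fstab entries look something like this; I've swapped in made-up mount point names and a made-up credentials file rather than my real ones, so treat it as a sketch of the shape only:
//192.168.1.12/2015OffSiteMirror  /mnt/OffSiteMirror  cifs  credentials=/root/.smbcreds,uid=gary,file_mode=0777,dir_mode=0777  0  0
//192.168.1.14/2015OnSiteMirror   /mnt/OnSiteMirror   cifs  credentials=/root/.smbcreds,uid=gary,file_mode=0777,dir_mode=0777  0  0
The point is that each share gets its own folder under /mnt instead of everything piling onto /mnt itself, and sudo mount -a picks the new entries up without a reboot.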
I've not yet experimented with writing a file from each computer to the Off Site Storage, to see if things like the delete-on-destination option, or adding different folders, cause them to appear everywhere again.
I have mounted all the folders on each computer to see if that brings back the symlink problem between all the computers, and it did not.
I've not messed with my external hard drive I call File Server, and probably won't. That's the one I manually carry from the house to the garage office to make a mirror of, for safe keeping.
But now that I finally have GRSync working properly, and have learned a whole lot more from this guy than I could learn on my own, I may set up a new File Server down at the house, but it won't be a NAS this time.
I have an old computer with 512 megs of RAM, which is too slow for anything other than doing accounting work on. I may place it down at the house with a huge hard drive in it, or an external, as my Off Site Storage.
As an aside: I know it is better to use ext3 or ext4 for everything. However, if I want my data available to my heirs when I die, I have to save it all on NTFS and not use characters in file names not recognized by Windows. This is one reason I've been using external drives in place of the NAS which the lightning killed. NTFS does not handle time stamps very well, which causes several files to be rewritten each time RSync runs, but not enough to cause a problem.
OK, I'm rambling. I'm elated, after all these years, I can finally PUT data where I want it, hi hi...
I've also written down everything I needed to do to make it all work, so if I have to build a new computer and install everything, I know how to set the GRSync back up again, hi hi... I forget easily.
Man Pages are totally useless to me, because they do not give examples.
Have a Great Day Yogi, and thanks!
PS: Also learned why umount never worked right for me as well. Again, because man pages do not give examples of how to format the instructions.
TTUL
Gary
Re: Computer Question for Yogi
Glad to hear your system is working as you hoped it would. I think we both learned something here. Thank you for providing all that detail regarding the solution to your problem. It brings back vague memories of the days I played the role of system administrator and had to know how to do such things. Obviously I didn't do a lot of it, or I would have been more helpful to you.
Unix man pages are pretty well structured and explain everything you ever wanted to know, except why you would need all the optional parameters. A few examples would be helpful, but then there are so many different platforms that rely on the same documentation that it would be impossible to cover all the variations in examples.
The problem with instruction manuals is that they assume you have prior knowledge. The same expectation exists when it comes to using computer technology on all levels. All you get in user manuals is what the buttons do, but they never explain why you would want to do it.
Re: Computer Question for Yogi
Hi Yogi
Man pages do seem to tell you quite a bit about each command, but I rarely find one that tells you how to form the command.
Take something really simple used in an RSync command, like EXCLUDE...
I looked on-line and found numerous examples where folks used it, but it never worked for me.
I found things like
--exclude=$HOME/.thumbnails
--exclude=*/.cache
--exclude= .thumbnails
But the man pages never said anything about the form to use other than this:
--exclude=PATTERN exclude files matching PATTERN
--exclude-from=FILE read exclude patterns from FILE
and the larger version:
--exclude=PATTERN
This option is a simplified form of the --filter option that defaults to an exclude rule and does not allow the full rule-parsing syntax of normal filter rules. See the FILTER RULES section for detailed information on this option.
--exclude-from=FILE
This option is related to the --exclude option, but it specifies a FILE that contains exclude patterns (one per line). Blank lines in the file and lines starting with ’;’ or ’#’ are ignored. If FILE is -, the list will be read from standard input.
And here are the Filter Rules mentioned:
FILTER RULES
The filter rules allow for flexible selection of which files to transfer (include) and which files to skip (exclude). The rules either directly specify include/exclude patterns or they specify a way to acquire more include/exclude patterns (e.g. to read them from a file).
As the list of files/directories to transfer is built, rsync checks each name to be transferred against the list of include/exclude patterns in turn, and the first matching pattern is acted on: if it is an exclude pattern, then that file is skipped; if it is an include pattern then that filename is not skipped; if no matching pattern is found, then the filename is not skipped.
Rsync builds an ordered list of filter rules as specified on the command-line. Filter rules have the following syntax:
RULE [PATTERN_OR_FILENAME]
RULE,MODIFIERS [PATTERN_OR_FILENAME]
You have your choice of using either short or long RULE names, as described below. If you use a short-named rule, the ’,’ separating the RULE from the MODIFIERS is optional. The PATTERN or FILENAME that follows (when present) must come after either a single space or an underscore (_). Here are the available rule prefixes:
exclude, - specifies an exclude pattern.
include, + specifies an include pattern.
merge, . specifies a merge-file to read for more rules.
dir-merge, : specifies a per-directory merge-file.
hide, H specifies a pattern for hiding files from the transfer.
show, S files that match the pattern are not hidden.
protect, P specifies a pattern for protecting files from deletion.
risk, R files that match the pattern are not protected.
clear, ! clears the current include/exclude list (takes no arg)
When rules are being read from a file, empty lines are ignored, as are comment lines that start with a "#".
Note that the --include/--exclude command-line options do not allow the full range of rule parsing as described above -- they only allow the specification of include/exclude patterns plus a "!" token to clear the list (and the normal comment parsing when rules are read from a file).
If a pattern does not begin with "- " (dash, space) or "+ " (plus, space), then the rule will be interpreted as if "+ " (for an include option) or "- " (for an exclude option) were prefixed to the string. A --filter option, on the other hand, must always contain either a short or long rule name at the start of the rule.
Note also that the --filter, --include, and --exclude options take one rule/pattern each. To add multiple ones, you can repeat the options on the command-line, use the merge-file syntax of the --filter option, or the --include-from/--exclude-from options.
Although they do show the syntax, it may not work that way if you are working with sub-folders.
If I'm syncing a single folder which contains thumbnails I could use --exclude=.thumbnails and it works.
But if I'm saving everything in a tree, I have to write it using the entire tree name.
E.g. --exclude=/home/gary/.thumbnails.
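To make this concrete, here are the two forms I have been playing with; the destination paths are only examples, not my exact sessions. As best I can tell, a pattern with no leading slash matches that name anywhere under the folder being sent, while a leading slash anchors it to the top of the transfer rather than to the root of the whole file system:
rsync -av --exclude='.thumbnails/' --exclude='.cache/' /home/gary/ /mnt/OnSiteMirror/gary/
rsync -av --exclude='/gary/.thumbnails/' /home/ /mnt/OnSiteMirror/home/
In the first line any .thumbnails or .cache folder in the tree is skipped; in the second, only /home/gary/.thumbnails is skipped because the pattern is anchored at /home.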
Sorry for making this so darn long, but I do have one more question I cannot figure out.
In GRSync under EXTRA OPTIONS it has boxes to run a command before or after Rsync.
"Execute this command before running Rsync"
"Click this item if you want to run a command before starting Rsync. Can be useful, for instance, to mount a file system before running Rsync."
In other words, GRSync clearly tells you what this box is for, such as mounting a file system.
However, using the proper command to mount a file system from command line does not work if placed here. Nor does the simple umount command after you run GRSync.
I've not found any bug reports related to this, and no one saying it doesn't work.
There must be something I don't know. The GRSync home page only tells you to view RSync for instructions.
Since all of my destinations are now mounted using fstab it really doesn't matter, but it still bugs me just the same that I can't do it.
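One idea I ran across but have NOT tried yet: supposedly, if the fstab entry carries the user and noauto options, a plain user is allowed to mount and unmount that one entry without sudo, so the GRSync boxes would only need the bare commands. Something like this, with example names rather than my real shares, and no promises it works for cifs:
//192.168.1.12/2015OffSiteMirror  /mnt/OffSiteMirror  cifs  credentials=/home/gary/.smbcreds,uid=gary,user,noauto  0  0
Then the box before RSync would just hold mount /mnt/OffSiteMirror and the box after it umount /mnt/OffSiteMirror.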
Thanks Yogi
TTUL
Gary
Re: Computer Question for Yogi
I've never had reason to use Grsync, but I can tell you some of the obvious things you may already know.
Grsync does exactly what rsync does but provides a GUI in which to do it. Executing rsync from the command line could be a tedious exercise in trial and error if you are not intimately familiar with all that information from the man pages. Trial and error is not a bad way to learn, but as I said earlier, it often requires some prior knowledge to accomplish the goal.
The "extra options" tab in Grsync has a checkbox to run the command as root. Are you doing that?
I'm not sure what you are trying to do before or after running rsync, but is it reasonable to expect the mount command issued from Grsync to work if your destination is already mounted via fstab? Likewise, unmounting might require permissions that Grsync does not have at the time you are trying to use it. Then too, does some sort of delay need to occur before the rsync part of the command is run? You probably would have to state that delay time in your command somehow.
The version of Grsync I have in one of my virtual machines has an option to do a trial run. You can see the results of the trial in a window opened by Grsync. Any error messages would be listed there, and perhaps reviewing what the error is will give you some idea of what needs to be done to successfully execute your optional commands.
I see what Grsync is doing, but I do not know how it is doing it. Are the before and after commands formed correctly by the program? Can you run the commands manually, i.e. before...: rsync... : after... : ? Is the Grsync output command syntax the same as what you can run successfully from the command line? I'd love to be able to tell you how to look at that, but I don't know how. I would be surprised if there is not an error log that you can review to see why Grsync isn't doing what it should. Check that log for enlightenment.
Re: Computer Question for Yogi
Hi Yogi
Yes I have an error log, which shows no errors.
Since I am now using fstab to mount the folders, I do not need it in GRSync.
It just bugged me that it did not work properly in GRSync is all.
Making a Trial Run produces no error messages about many things, including this.
Probably because it is writing only the on-board folder and not attempting a write to a remote folder.
Using GRSync to Mount the folder does work, but with a "Failed to Mount" error message.
Then after it runs, it closes with another error "Unable to Umount."
It does mount the folder, syncs the data, but does not unmount the folder.
Yes, I must have the "Run as Root" for it to mount.
Using RSync from command line has the exact same problem, when trying to mount, run RSync, then unmount a drive or folder.
I played around with using a separate instruction file which RSync reads. But this is only good if you want to sync only certain folders from a list of many.
It's no big deal really, since I now leave the drives mounted all the time, and they automatically mount when I restart the computer.
I was thinking there might be some special code which needed to be used in the command line as placed in the box in GRSync that I didn't know about.
Here is the line I use on the command line to open one of the remote drives:
sudo mount -t cifs //192.168.1.12/OffSiteStorage /MountDebiBioStar -o username=gary,password=Xxxxxx,iocharset=utf8,file_mode=0777,dir_mode=0777
The above is one of five such lines; the other four I use for the other machines. Because this one does require "Run as Root," I have to type in my password to run it as root.
I figured out why I have to run as root on this one, but not on the other four. The other four use the ext4 file system through Linux computers. The above is Debi's Windows XP computer with an external hard drive in NTFS format. When I do a LOCAL RSync to an NTFS drive, I do not need to run as root.
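As a side note, one thing I picked up while digging through the cifs mount options: the password does not have to sit right there in the command. It can go in a small credentials file instead, read with the credentials= option. A sketch, with a made-up file name. The file /home/gary/.smbcreds (chmod 600 so only I can read it) holds:
username=gary
password=Xxxxxx
And the mount line then becomes:
sudo mount -t cifs //192.168.1.12/OffSiteStorage /MountDebiBioStar -o credentials=/home/gary/.smbcreds,iocharset=utf8,file_mode=0777,dir_mode=0777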
In any case, we now have everything running automatically, which saves me from having to do it.
A Cron Job now runs things, just like before my server died. At 1am everything is synced with my local drive; at 3am the local drive is synced to the drive down in the house.
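For anyone curious, the crontab entries are really just the rsync lines with a time stuck on the front. A rough sketch; the paths and options here are stand-ins, not my exact sessions:
# edit with: crontab -e   (minute hour day month weekday command)
0 1 * * *  rsync -av --modify-window=1 /home/gary/DataPartition/ /mnt/LocalUSB/DataPartition/
0 3 * * *  rsync -av --modify-window=1 /mnt/LocalUSB/DataPartition/ /mnt/OffSiteMirror/DataPartition/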
Plus, if I do change some important files, I can open GRSync and run the session to the local drive, and the off-site drive at the touch of a button.
Don't laugh, but this is how I got myself into hot water the last time. It ran for so many years without my touching it, I forgot how I set it all up in the first place.
THIS TIME I have written every step down, including all the dependencies it needs to work right.
This does not mean I'll be able to do it again if the computer's programming changes again and some of the rules change.
Since my IP addresses are not static, I may have to go through everything and change those around some time after a power outage, hi hi... Or like last time, when I bought a new router.
I'm not going to mess with making an .sh file (I think that's what it's called) to run the mounts and RSync that way. I've not learned enough about Bash to keep myself out of trouble.
TTUL
Gary
[split] Important Notice
Thanks for the vote of confidence Yogi...
Remember, you are talking to the guy who, when he started to study PHP, did not know why his lessons did not work. The book forgot to mention it only works if you have your computer set up as a server.
I had to learn how to do that, and never quite got it right either.
Speaking of the home network: although the network now works A-OK, it does have a couple of strange problems from time to time, which may be related to something I've not yet figured out.
I work from a copy of my file server external drive. Ignore the words file server to avoid confusion.
I keep all of my data on an external drive, which I now keep unplugged and turned off.
A copy of this data is stored on an internal drive on another computer.
I access this data through the Samba Network, which is a faux mount. This drive is also Mounted on my computer so RSync can mirror the changes to the external drive when I turn it on to run RSync.
If I work directly from the external drive, formatted as NTFS so it is readable by plugging into a Windows computer, all works fine, including time stamps.
The copy of this external drive is on a Linux computer formatted as EXT4.
While working through the LAN with the Samba faux mount of the Shared Folder on my desktop, every so often when I go to open a certain folder, the shared drive disappears from the Samba mount.
After this happened a few times, I began opening it by going through the directory tree directly to the Mounted drive. I have never had it suddenly close on me doing it this way.
Must be a bug in Samba somewhere?
I still have one major thing bugging the heck out of me though.
On this computer's internal drive, I have a partition I named Data Partition; it is where I keep the items I work on almost every day, plus several other files I might need during the week. This keeps me from having to go through the LAN or to the External Drive.
You know me and my redundant backups, hi hi...
I copy this EXT4 Data Partition from my computer, using RSync, to:
1) Computer #2 formatted EXT4 via LAN
2) The External Drive formatted NTFS via USB
3) An External Drive connected to a Remote Computer down at the house, formatted NTFS via LAN.
Here is what I don't understand at all.
Running RSync to 1) Computer #2 works perfectly.
Running RSync to 3) Remote Computer, I must be Root.
As an experiment, I installed the same External Drive from down at the house onto 1) Computer #2, up here in the office. It still goes through the same LAN, through a computer to the External Drive. RSync works flawlessly without being Root, and works the same as Root.
Bring the same drive back down to the house, and the only way to save to it is as Root. Plus I get this one error, NOT a WARNING in red, about it not doing something. Oh yes, it says it is not mounted, but it is, and the data still copies to it just fine.
I do know about the FAT32 issue, so I do use --modify-window=1 (GRSync uses --modify-window=NUM), but I also tried --modify-window=2, which cut out some rewrites of unchanged existing files. Even so, I have several files that are unchanged which still get copied each time I run either GRSync or RSync.
To keep this from happening on a few larger folders which will never get changed, I copied them outside of the main folder I back up, and made sure they are found on all of my redundant backups in the same location.
Nevertheless, Rsync should not be backing up unchanged files I've stored for years.
FWIW: Using the --exclude option would probably also keep them from copying each time, but then using --delete and running the sync in reverse may delete them, so it is best I keep them separate.
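One suggestion I keep seeing for NTFS and FAT targets, which I have not fully tested myself: skip the full archive flag and ask only for recursion and times, since NTFS cannot store Linux owners and permissions, and trying to set them on every run can make untouched files look like they need attention. Roughly, with stand-in paths:
rsync -rtv --modify-window=1 /home/gary/DataPartition/ /mnt/OffSiteMirror/DataPartition/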
I still say what the Linux world says about a Mounted File being the same as being on your own computer is false. You can write to the Mounted File and it is only stored in the temporary folder you named the same as your actual Mounted Folder. If the Link fails, the data does not go to the remote location at all. It just stays as temporary and gives a false sense of security.
This is why I always double-check my other file locations, either using Samba or by going directly to the computer and loading the files from there. Remember, my frau lost several years of her photographs due to a bad backup program that only saved a LINK back to the hard drive she was backing up.
When I checked the Windows backup drive, all the files appeared to be there. But when her hard drive failed, all I found on the backup drive were dead Links back to the main hard drive. This was from a popular, bought-and-paid-for Windows backup program. Needless to say, they got an earful from me.
Ever since then, I've used External Drives and checked them on a different computer to make sure the files were actually there.
OK, off my soapbox.
No need to comment on the above, as I've already tried almost all the information I've gleaned from various websites to no avail.
I should say one thing I tried did work for one folder, but not the other. I made a new folder and copied the contents to the new folder, then deleted the old folder. This stopped Nautilus from crashing.
Doing the same on another folder did not help one bit. If I try opening it through the Network Share, it crashes every time. Studying the log files does not show a problem. The one file gave me some word that started with a P, like procurement or something. That file works now that I made a new one, so I don't see that P word in my log file anymore.
Have a great day Yogi!
TTUL
Gary
Re: Computer Question for Yogi
I have a general idea of what you are doing there, but the details are overwhelming. There is plenty of opportunity for errors. It seems that you are mixing file systems and juggling mount points beyond what even Linux can easily understand. LOL It boils down to having the correct permissions and keeping track of mount points. When you move around the hardware for your experiments, are you certain that you are un-mounting devices correctly? Are you able to verify that the contents of /etc/fstab and /etc/mtab match reality? You yourself admit to having some doubts about how to execute rsync and mount commands properly, and that in itself could be causing problems.
It is prudent to have multiple backups of important data. It might not be as simple as I see it, but I would look for ways to access all your backup devices easily. For example, many routers can be set up as controllers for external hard drives attached to them. The same goes for a good NAS box. By accessing devices and file systems via routers and/or NAS you could eliminate the spider web of connections you need to keep track of. Access to the router is a given for your LAN to work, and setting up NAS as a file server is what it is made for. Many NAS devices can also be expanded by adding external drives to them. Assuming everything is on the same LAN, all your workstations can easily access router storage and NAS storage no matter how many devices you attach to them. The advantage of doing it that way is that you don't have to keep track of permissions or mount points. The (acting) servers do it for you. There are a ton of backup programs out there that you can use this way with confidence, or you can write your own cron. As far as mirroring your workstation in real time goes, I think you have that figured out already.
- Kellemora
- Brainiac Class Poster
- Posts: 3389
- Joined: 07 Jul 2012, 18:52
- Location: From St. Louis, current Knoxville, TN
- Contact:
Re: Computer Question for Yogi
Looks like I did it again. Lost my post I was working on.
I explained with too much verbosity the reason I am currently using so many different places for file storage.
The greatly truncated version is simply this: after the lightning destroyed my NAS, and after getting a 1 terabyte internal SATA HD in this rebuilt computer, I moved my stored data from three smaller hard drives to it, then cleaned up and re-sorted the data into a better arranged system of folders.
I cannot afford to replace the NAS and buy a backup drive to hold the data from it.
So I am making use of my pair of 500 gig hard drives for my two main folders. One for Data, and one as a Mirror for backup. Plus a pair of old 200 gig hard drives for my remaining folders.
In my message which got lost (speaking of which, I thought this site saved messages as drafts, but I never could find them to continue where I left off), I reiterated how I kept two external drives in the office and two matching drives down at the house, and physically carried them back and forth to make mirrored backups.
So, since I cannot afford anything larger right now, I take my on-board data and save the changed folders to the 500 gig external, and the rest of the smaller folders to the 200 gig external, both formatted NTFS so they can be read by Windows computers. This part works A-OK as long as both drives are connected to THIS computer's USB ports.
I only have problems if I try going through the LAN and connect them to other computers. Seems to work OK going from a Linux box to a Linux box, but not OK from a Linux box to a Windows box.
I've been doing a lot of experimenting by using both internal and external hard drives connected to different old computers. Saving the data to NTFS drives (not FAT32) does not seem to cause any problems. So what I am thinking about doing is placing my super old 512 RAM machine down at the house and using it to connect the external hard drives to. I've used it up here with several old internal and external drives connected to it, and RSync works flawlessly each and every time.
The only time I have to redo my MOUNT points is when the Router is rebooted and assigns new IP numbers to the computers. I do not want to use static IP addresses, since I change computers around so often. I don't think my Router has USB ports on it. It might, I don't remember anymore.
A website told me to try using -c in my RSync line. I did that and instead of taking a couple of minutes, it took HOURS for the data to rewrite itself. It kept doing this each time, so I removed the -c.
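For reference, the difference between the two runs is roughly this: by default rsync skips files whose size and modification time match, while -c reads and checksums every file on both ends before deciding anything, which is why it can take hours even when nothing changed. A sketch with placeholder paths:

# default quick check: compare size and modification time only
rsync -av /home/gary/DataStorage/ /mnt/2015OnSiteStorage/
# with -c: checksum every file on both sides first
rsync -avc /home/gary/DataStorage/ /mnt/2015OnSiteStorage/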
I just don't understand why I can RSync to every computer without being ROOT, except for the Windows computer. To reach the hard drive on it, which mounts A-OK, I must be ROOT in order to write to it.
Otherwise I get permission denied on every file. Running as ROOT causes other problems too, which make no sense.
If I'm saving to an external USB connected drive formatted as NTFS, I do not have to be ROOT and I get no errors. Or if I'm going through the LAN to the exact same drive connected to another Linux computer, I do not have to be ROOT and I get no errors. So why, when I'm going to the exact same drive connected to a Windows computer, do I get nothing but permission denied errors? Using ROOT brings up other errors which can be annoying. I can MOUNT the remote drive. I can view and use the remote drive, and save data to it without a problem. But if I try using RSync, it says the drive is not mounted, when it is.
And YES I know the difference between MOUNTING to a Mount Point, and having a Shared Mount via Nautilus. RSync CANNOT see a Nautilus mounted shared folder.
Nor does a folder need to be shared to be MOUNTED, it is two totally different ways of doing things.
OK, I'll quit bending your ear. It's a shame I lost my other message, as I gave short meaningful details of my current setup.
TTUL
Gary
Re: Computer Question for Yogi
I don't know all the technical details, but you seem to be having difficulty negotiating local storage versus Windows Shares. They are two different animals. Anything local to your Linux box will have a transparent and seamless connection. Anything over your LAN will likely go through SAMBA and require the proper permissions/credentials for logging into the Windows Share. I'm not sure how rsync handles that, but I do know you can attach login credentials to the command line.
rsync is an interesting beast when you read about how it works. Essentially there is a lot of handshaking going on, checking checksums and so forth. rsync only writes blocks of data that are different, and doing that kind of writing on your Windows Share could be seen as a security issue by Windows. Thus permissions (root access) need to be properly established before rsync will work there.
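As a rough sketch of attaching credentials, assuming the Windows box sits at 192.168.1.10 and shares the 2015OffSiteStorage folder (the address, account, and password below are placeholders):

# mount the Windows share using a stored credentials file instead of typing the password
sudo mount -t cifs //192.168.1.10/2015OffSiteStorage /mnt/2015OffSiteStorage -o credentials=/root/.smbcred
# where /root/.smbcred contains two lines:
#   username=winuser
#   password=secret
# then sync into the mounted path as usual
rsync -av /home/gary/DataStorage/ /mnt/2015OffSiteStorage/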
Here on nBF, if you go to your User Control Panel, under the Overview tab, you will see a selection where you may manage your stored draft copies. Unfortunately, the system does not save draft copies automatically. You must physically click the [Save draft] button in the full editor in order to preserve your reply for future editing. I have been fortunate on several occasions to simply go back in my browser history to recover replies that I thought I lost. Success there requires previewing your reply first so that it goes into a temporary browser cache. The bad news is that when you turn off the browser, the temporary text disappears too. That's why saving drafts is a better idea.
- Kellemora
- Brainiac Class Poster
- Posts: 3389
- Joined: 07 Jul 2012, 18:52
- Location: From St. Louis, current Knoxville, TN
- Contact:
Re: Computer Question for Yogi
OK, trying again, third time I've lost this message.
I had said I have a UPS down at the house to keep my modem and router hot, so the IP addresses don't change when I power up a computer. I quit using UPS units on my computers because every one I've ever owned will, for no logical reason, cycle and cause my computers to reboot.
I think I figured out what is causing me to lose my message. When I go to type the letter "q" and accidentally hit the Tab key, then hit Backspace, this window disappears.
So whatever I've written up to the time I hit Tab, then Backspace to clear the Tab, is gone when the window closes.
Yes, Network Samba Mount is NOT the same system as a true Mounted Drive or Folder.
This is why RSync cannot see a shared folder. Shared folders can only be seen using Nautilus; they are mounted in Nautilus, by Nautilus, using Samba. RSync is a totally separate system.
My mind has just drawn a blank, but I don't think you have to Share a folder in order to Mount it the right way on Linux, but you do on Windows machines?
Naturally, I have to Share a Windows folder to view it through the Network, same with Linux.
But right this moment, I can't remember if I had to share a folder in order to mount it for Linux.
Seems like I wouldn't have to, since I think it uses SSH or NFS or something else other than Samba.
I just checked my Synaptic Package Manager, and I DO NOT have Samba installed on this computer. Yet using the Network Icon, I can view shared folders from any computer, including the Windows box. But I cannot share a folder on this computer, because I do not have Samba installed.
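For the record, something like this shows exactly which pieces are installed; my understanding is that the Nautilus network browsing usually comes from the gvfs/smbclient side rather than from the Samba server package itself:

dpkg -l | grep -i samba                # the server and common packages, if any
dpkg -l | grep -i winbind
dpkg -l | grep -E 'gvfs|smbclient'     # the client-side bits Nautilus tends to use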
Now I'm confusing myself. Let me check a few things so I can say something that is not gibberish.
TTUL
Gary
- Kellemora
- Brainiac Class Poster
- Posts: 3389
- Joined: 07 Jul 2012, 18:52
- Location: From St. Louis, current Knoxville, TN
- Contact:
Re: Computer Question for Yogi
Just jumped back to say, YES, in order to MOUNT a folder, the folder MUST be Shared, both on Linux and Windows. Although network sharing is handled by a totally different system, what it does is make the folder PUBLIC so it can be seen by the Root system and Mounted.
All of my Linux machines with Shared Folders on them do have Samba installed.
On this computer I have not shared any folders, so I only have Samba-Common and Winbind installed.
If I try to share a folder, it says I must install Samba first.
All of my MOUNTPOINT folders are placed in ROOT. The permissions for all mounted folders are RW RW RW, for both the Linux and Windows folders. Yet I have to be ROOT to write to the Windows folder using RSync.
I'll study this a bit more and maybe find the reason.
None of my mounted folders appear in the .gvfs folder under my home directory, which seems to be normal for Debian, but not for Ubuntu or Mint.
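One way to see what is actually going on with the mount, using the names from this thread as placeholders:

ls -ld /mnt/2015OffSiteStorage    # shows the owner and mode the mount point has right now
mount | grep cifs                 # shows the options the CIFS mount was given, including any uid=, gid=, or file_mode=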
Have a great day!
TTUL
Gary
- Kellemora
- Brainiac Class Poster
- Posts: 3389
- Joined: 07 Jul 2012, 18:52
- Location: From St. Louis, current Knoxville, TN
- Contact:
Re: Computer Question for Yogi
WooHoo, ALL FIXED!!!
I knew if I surfed enough I would eventually stumble across a suggestion that works.
Turns out, although my MountPoint folder had the proper permissions, AFTER I mount a drive connected through a Windows machine, the ownership gets changed back to ROOT, which is the reason I had to RUN AS ROOT to write to that drive.
Even working as ROOT I cannot change the permissions of a remote drive on a Windows machine.
Using (file_mode=0777,dir_mode=0777) in my mount command had no effect at changing it from ROOT.
However, adding (uid=me,gid=me) did the trick.
Now using RSync without using RUN AS ROOT works perfectly, the same as all the other drives.
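For anyone following along later, the kind of mount line described above would look roughly like this; the address, share name, and user name are placeholders, and uid/gid should be the regular login that runs RSync:

sudo mount -t cifs //192.168.1.10/2015OffSiteStorage /mnt/2015OffSiteStorage \
    -o username=winuser,password=secret,uid=gary,gid=gary,file_mode=0777,dir_mode=0777
# with the mounted files owned by the normal user, rsync no longer needs root
rsync -av /home/gary/DataStorage/ /mnt/2015OffSiteStorage/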
Wish I had the time to have figured all this out five years ago. Would have saved hours of using the old Copy and Paste way of doing things.
Your different prompting questions got me to look back at things I had forgotten about, and as such, led me to hunt in the right areas for the problem. So even if you don't think you helped, you did!
So a BIG Thank You for putting up with this olde geezer, hi hi...
TTUL
Gary
Re: Computer Question for Yogi
http://sociusba.com/wp-content/uploads/ ... oheads.jpg
The moral of the story is that two heads are indeed better than one. Good job on finding a solution. I may not have all the answers, but I generally know where to look.
- Kellemora
- Brainiac Class Poster
- Posts: 3389
- Joined: 07 Jul 2012, 18:52
- Location: From St. Louis, current Knoxville, TN
- Contact:
Re: Computer Question for Yogi
Agreed!
Being down here in east podunk all alone, I don't have anyone to bounce things off of. Trying to use the various websites associated with a topic usually brings up tons more questions not relevant to the problem. Or you get put down pretty hard for not knowing the prerequisites, or get bashed for using the beginners forum when your icon shows you're a seven year member, hi hi.
So I just stay out of them now.
I'm bad at remembering things, even when I write down every step I took, because I did it wrong or it doesn't work that way anymore. I did the same thing with the current problem I finally solved: went through all of my notes and rewrote them, step by step, for when I need to set up another computer or another folder I need to mount.
This computer has a graphics card so new it is not supported in Linux. I can and did download the driver for it from the maker of the card, it worked perfectly with the driver. Then I accidentally allowed a kernel upgrade and ended up with the black screen of death, and the newer GRUB does not show the previous kernel versions to roll back to them.
My only alternative was to reload the OS from scratch and start over. I spent days trying to undo the mess before doing so, to no avail. It was not that hard to install the graphic driver when I first did it, but for the life of me, I cannot figure out how I installed it in the kernel, and the program that does it for you, doesn't work for me now.
No biggie, I'm sticking with the automatically installed driver. It is the wrong one, and if I force it to use the secondary default driver, I have the same problems. It just bugs me I don't remember how I did it.
That, plus I need to find out how to get GRUB to show my previous kernels so I can fall back on them. It does on my other computers, but not on this one. I'm not schmartz enough to figure these things out easily. But given enough time, if I can find time to work on them, I eventually do.
Thanks for all your help Yogi, it is much appreciated. Gets me thinking more clearly about stuff too!
TTUL
Gary
Re: Computer Question for Yogi
You and I are battling exactly opposite problems. There are too many options on my grub menus.
If you look into the /boot directory you will see a list of all the available kernels on your machine. But, you probably knew that already. You can use this to determine what your current kernel is:
- uname -r
- dpkg -l | grep linux-image-
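And for a quick look at which kernel images are actually sitting in /boot on a Debian or Ubuntu style system:

ls /boot/vmlinuz-*    # one entry per installed kernel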
I would not attempt to modify the grub menu manually because it is insanely complicated. There is a package called Grub Customizer that is available for Ubuntu. It may work with other distros as well. It provides a GUI along with the proper scripts to make modifying grub a simple task. I love it because I can change grub fonts and background graphics so that I can easily tell which distro and device I am running.
Then there is boot-repair-disk which you can download and make into a live CD. This will erase your current grub and replace it with a standard version that so far has never failed me. The only problem is that it lists all the OS's on your machine and all the boot options associated with them. You may find that part the most appealing, but to me it's a nuisance.
My nVidia video card gave me all kinds of hell when I tried to use it in Ubuntu. I ended up learning how to install it from the Linux shell. All the instructions I need are written in a step by step fashion and I no longer have to remember how to do it. Then I discovered that Ubuntu does not support the latest from nVidia but they do have a stock nVidia driver in their repositories. It gets updated from time to time and works flawlessly. Since I'm not an avid gamer I don't miss the esoteric options available on the card but not accessible via the standard driver. However, if I want to, I can install the latest and greatest by brute force.
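If it helps, on Ubuntu-family systems that stock driver can usually be pulled in with the ubuntu-drivers tool (assuming the ubuntu-drivers-common package is present; other distros have their own equivalents):

ubuntu-drivers devices              # lists the drivers the distro knows about for the detected card
sudo ubuntu-drivers autoinstall     # installs the recommended one from the repositories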
- Kellemora
- Brainiac Class Poster
- Posts: 3389
- Joined: 07 Jul 2012, 18:52
- Location: From St. Louis, current Knoxville, TN
- Contact:
Re: Computer Question for Yogi
I could not get into my system, even from a live CD; it was really messed up big time. And of course, way over my head too.
I do have several repair disks and use them to do things all the time. The old GRUB was easy to work with and set up how you want; the new GRUB2, or whatever they call it, is not so easy.
I did figure out why one computer showed older kernels and the newer installs don't. During installation I selected the package maintainer's version. This is good in a way, as I don't get all those updates you mentioned. My old repaired computer did not have the package maintainer's version, so every little update appears for installation. These can sometimes be two, three or more per week, like you said, a royal pain, but at least they load quickly. The package maintainer's version stores up the updates until a serious one comes along, then does them all at once. I only get these about once every other month.
When I boot up, I get a message about needing to set something to use their suggested video driver, which is a very wrong one to use. I did set it once and then I had no video at all. But in this case, it was just a matter of turning the switch off and not using their suggestion. Whatever the other default currently is, it works, but it does have some minor problems. Not enough to make it worth installing the correct driver again and possibly messing up the system with an accidental kernel upgrade like I did last time.
Because you can install the entire OS and its programs using Virtual Box, and then save the whole machine as a single file, I tried that on the used machine I purchased. The idea of only having to copy the machine file back to fix everything seemed like a great idea, and fast too.
After I installed the OS, got all the settings the way I wanted them, installed the programs I use most often, and set them up with their extensions and plug-ins, and everything was working perfectly, I saved a copy to my external drive.
Learned the hard way that to reinstall, you have to create a new machine using the original machine name, then overwrite the file with the saved copy. It didn't work, but I learned why too: I had to go in and change the UUID and a few other things on the new machine before copying the old one back. It still didn't work.
I started over from scratch again, using different settings when I set up Virtual Box. After all was set up, I made a copy of the file, but I have not tried reloading it yet. I hit another problem on the second install. Everything slowed down to a crawl. I rebooted a couple of times, and when I work outside Virtual Box, things run normally, but inside Virtual Box everything is slo-mo. It wasn't like this on my original install, so I think it has to do with some of the installation settings I changed.
Have not had time to mess with it again for several months, but I still like the idea of simply copying a single file to restore the OS and all of its programs.
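For the single-file restore idea, VirtualBox's own export and import might sidestep the UUID headaches, since an imported appliance gets fresh UUIDs. A rough sketch, where "OfficeVM" and the backup path are placeholders:

# save the whole machine (settings plus disk) into one .ova file
VBoxManage export "OfficeVM" -o /mnt/backup/OfficeVM.ova
# later, recreate the machine from that file; new UUIDs are assigned on import
VBoxManage import /mnt/backup/OfficeVM.ova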
Have a great day Yogi.
Come tomorrow morning and I'll be back at my job working from 8am till 4pm or later. Burning up all of my writing time. I'll jump back on my breaks as usual.
TTUL
Gary