Okay, so I have this problem with invalid characters. For example, I have an external HDD that I have used on Windows and on Debian/Ubuntu, and some of my folders contain åäö, but I get a "?" instead. It is the same in GNOME Terminal, but there I can change the character encoding to Unicode (UTF-8) and then it works fine, although it keeps jumping back to the old character setting.
In rc.conf I use "en_US.UTF8" and have also tried "en_US".
(Sorry for any misspelling.)
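Concretely, the relevant line in my /etc/rc.conf currently reads something like:
LOCALE="en_US.UTF8"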
Typo: you have to spell the encoding _exactly_ as `locale -a` names it, namely en_US.utf8.
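You can check the exact spelling yourself; the output depends on which locales you have generated, but it should look roughly like:
$ locale -a | grep -i en_us
en_US.utf8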
Last edited by Mr.Elendig (2008-08-13 12:33:51)
Thanks, but that did not help; still the same thing. I tried to create a folder named åäö just now and got:
"The name "åäö" is not valid. Please use a different name."
I assume that the external drive is FAT/NTFS? If so, you probably have to set the encoding at mount time for the device.
Yes, it is FAT32 formatted. How do I do that? I don't know fstab that well.
I also noticed now that I have the same problem with my local hard drive, which is ext3 formatted.
Last edited by cipp (2008-08-13 12:58:47)
Typo: you have to spell the encoding _exactly_ as `locale -a` names it, namely en_US.utf8.
Yes, by the way I'm working on Debian (what a pain...) at the moment and I noticed the names of the locales were not the same. For example, where in Arch it's en_US.utf8, in Debian it's en_US.UTF-8. This caused a few bugs when I tried to use my Arch-based scripts...
Back on topic: use iocharset=utf8 as an option in fstab, like this for example:
/dev/sdxy /mnt/part vfat user,noauto,iocharset=utf8 0 0
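After editing fstab, remounting the partition picks up the new option; with the example entry above that would be something like:
umount /mnt/part
mount /mnt/part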
Last edited by catwell (2008-08-13 13:02:34)
Yes, by the way I'm working on Debian (what a pain...) at the moment and I noticed the names of the locales were not the same. For example, where in Arch it's en_US.utf8, in Debian it's en_US.UTF-8. This caused a few bugs when I tried to use my Arch-based scripts...
/dev/sdxy /mnt/part vfat user,noauto,iocharset=utf8 0 0
I know, but since this is archlinux.org...
And I really hate distros that use 'custom' names for the encoding, including the fact that Arch calls it UTF-8 in locale.gen (not that that's Arch's fault, AFAIK).
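To make the mismatch concrete (on my system, at least): the entry enabled in /etc/locale.gen is spelled one way, while locale -a reports the normalized name:
# /etc/locale.gen
en_US.UTF-8 UTF-8

$ locale -a | grep en_US
en_US.utf8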
#
# /etc/fstab: static file system information
#
# <file system> <dir> <type> <options> <dump> <pass>
none /dev/pts devpts defaults 0 0
none /dev/shm tmpfs defaults 0 0
/dev/disk /mnt/disk-1 vfat user,noauto,iocharset=utf8 0 0
/dev/cdrom /media/cdrom auto ro,user,noauto,unhide 0 0
/dev/dvd /media/dvd auto ro,user,noauto,unhide 0 0
/dev/sda2 swap swap defaults 0 0
UUID=c74b909d-0840-47ee-90eb-90f834d2640d / ext3 defaults 0 1
It looks like this and it is still not working. None of the drives I mount work with åäö.
Try the 'utf8' mount option instead, and if that fails too, use 'uni_xlate'.
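With the fstab you posted, that would mean changing the vfat line to something like (untested):
/dev/disk /mnt/disk-1 vfat user,noauto,utf8 0 0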
I know this is an old post...
I'm using Arch with Openbox and Thunar with the volume manager option. Any time I try to mount an external HDD or a USB stick with the NTFS file system, I get the "invalid encoding" error. I can handle it with the mount command, but I would like it to work without using the console, for every USB device I mount with whatever file manager. Before switching to Openbox I used GNOME on the same system without any problem. How can I do this? Thank you!
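For what it's worth, the manual mount I fall back on looks something like this (assuming ntfs-3g; the device node, mount point and locale value are just examples):
mount -t ntfs-3g /dev/sdb1 /mnt/usb -o locale=en_US.utf8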
I know this is an old post...
...but? Open a new one then, or better, search the wiki first.
http://wiki.archlinux.org/index.php/For … Bumping.27
Old thread, closed.
hokasch's link above explains everything.