Hi everybody,
I recently imaged my drive using Acronis and copied it over to a RAID-1 I had set up. When I switched to using the RAID-1 as my primary disk, I ran into some issues with GRUB and /etc/fstab because they referenced the filesystems by UUID, so I changed them to use /dev/sda3, which is the root (/) partition. It booted fine, but for some reason it wasn't using my updated /boot/grub/menu.lst file and was still booting with the UUIDs from the menu.lst it was actually reading.
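For reference, a quick way to list the UUIDs that menu.lst and /etc/fstab would need (assuming the partition layout described below, and that the clone ended up with different filesystem UUIDs) is:

```shell
# Print the filesystem UUIDs for the new array's partitions; these are
# the values GRUB's menu.lst and /etc/fstab expect when booting by UUID.
blkid /dev/sda1 /dev/sda2 /dev/sda3 /dev/sda4
```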
I decided to reinstall GRUB and set it up using the Arch Wiki again. However, when I did that I got the GRUB Error 15 issue. After trying several suggestions in forums and reading the wiki again, I'm now stuck with my install hanging at:
Booting...
GRUB Loading stage1.5
Here are my config files:
/etc/fstab:
#
# /etc/fstab: static file system information
#
# <file system> <dir> <type> <options> <dump> <pass>
devpts /dev/pts devpts defaults 0 0
shm /dev/shm tmpfs nodev,nosuid 0 0
/dev/cdrom /media/cd auto ro,user,noauto,unhide 0 0
#/dev/dvd /media/dvd auto ro,user,noauto,unhide 0 0
#/dev/fd0 /media/fl auto user,noauto 0 0
/dev/sda2 swap swap defaults 0 0
/dev/sda1 /boot ext2 defaults 0 1
/dev/sda3 / ext4 defaults 0 1
/dev/sda4 /home ext4 defaults,user_xattr 0 1
/boot/grub/device.map:
(fd0) /dev/fd0
(hd0) /dev/sda
(hd1) /dev/sdb
(hd2) /dev/sdc
/boot/grub/menu.lst:
# Config file for GRUB - The GNU GRand Unified Bootloader
# /boot/grub/menu.lst
# DEVICE NAME CONVERSIONS
#
# Linux Grub
# -------------------------
# /dev/fd0 (fd0)
# /dev/sda (hd0)
# /dev/sdb2 (hd1,1)
# /dev/sda3 (hd0,2)
#
# FRAMEBUFFER RESOLUTION SETTINGS
# +-------------------------------------------------+
# | 640x480 800x600 1024x768 1280x1024
# ----+--------------------------------------------
# 256 | 0x301=769 0x303=771 0x305=773 0x307=775
# 32K | 0x310=784 0x313=787 0x316=790 0x319=793
# 64K | 0x311=785 0x314=788 0x317=791 0x31A=794
# 16M | 0x312=786 0x315=789 0x318=792 0x31B=795
# +-------------------------------------------------+
# for more details and different resolutions see
# http://wiki.archlinux.org/index.php/GRUB#Framebuffer_Resolution
# general configuration:
timeout 5
default 0
color light-blue/black light-cyan/blue
# boot sections follow
# each is implicitly numbered from 0 in the order of appearance below
#
# TIP: If you want a 1024x768 framebuffer, add "vga=773" to your kernel line.
#
#-*
# (0) Arch Linux
title Arch Linux
root (hd0,0)
kernel /vmlinuz26 root=/dev/sda3 ro
initrd /kernel26.img
# (1) Arch Linux Fallback
title Arch Linux Fallback
root (hd0,0)
kernel /vmlinuz26 root=/dev/sda3 ro
initrd /kernel26-fallback.img
# (2) Windows
#title Windows
#rootnoverify (hd0,0)
#makeactive
#chainloader +1
When I installed GRUB, I ran the following commands:
root (hd0,0)
setup (hd0)
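The full session in the grub shell looks roughly like this; the `find` step is an extra sanity check (not in my original commands) that shows which BIOS drive GRUB actually thinks holds stage1 before setup writes to the MBR:

```shell
# Non-interactive version of the root/setup install; 'find' should
# print (hd0,0) for the layout above if GRUB can see /boot at all.
grub --batch --no-floppy <<'EOF'
find /grub/stage1
root (hd0,0)
setup (hd0)
quit
EOF
```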
Let me know where I went wrong :-)
Thanks!
Matt
A couple questions:
1) Is the RAID-1 setup a hardware RAID or software RAID?
2) Can you post the output from `fdisk -l` and specify which disk is your old primary disk, and which disk(s) is your new RAID?
1) Is the RAID-1 setup a hardware RAID or software RAID?
Hardware
2) Can you post the output from `fdisk -l` and specify which disk is your old primary disk, and which disk(s) is your new RAID?
fdisk -l:
Disk /dev/sda: 1000.2 GB, 1000204886016 bytes
255 heads, 63 sectors/track, 121601 cylinders
Units = cylinders of 16065 * 512 = 8225280 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x1d4bc93c
Device Boot Start End Blocks Id System
/dev/sda1 * 1 17 136521 83 Linux
/dev/sda2 18 50 265072+ 82 Linux swap / Solaris
/dev/sda3 51 16193 129668647+ 83 Linux
/dev/sda4 16194 121601 846689760 83 Linux
Disk /dev/sdc: 808.9 GB, 808888614912 bytes
197 heads, 63 sectors/track, 127295 cylinders
Units = cylinders of 12411 * 512 = 6354432 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x5597177c
Device Boot Start End Blocks Id System
/dev/sdc1 1 127296 789929264 83 Linux
Disk /dev/sdb: 1000.2 GB, 1000204886016 bytes
255 heads, 63 sectors/track, 121601 cylinders
Units = cylinders of 16065 * 512 = 8225280 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x1d4bc93c
Device Boot Start End Blocks Id System
/dev/sdb1 * 1 17 136521 83 Linux
/dev/sdb2 18 50 265072+ 82 Linux swap / Solaris
/dev/sdb3 51 16193 129668647+ 83 Linux
/dev/sdb4 16194 121601 846689760 83 Linux
Disk /dev/sdd: 1016 MB, 1016594432 bytes
255 heads, 63 sectors/track, 123 cylinders
Units = cylinders of 16065 * 512 = 8225280 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x0217930c
Device Boot Start End Blocks Id System
/dev/sdd1 * 1 124 992608+ 6 FAT16
Partition 1 has different physical/logical endings:
phys=(122, 254, 63) logical=(123, 147, 24)
/dev/sda and /dev/sdb are the new RAID and /dev/sdc is the old drive
Matt
Hmm, are you sure it's true hardware RAID? What RAID card are you using? A true hw RAID will present a single block device to the system, not the two disks that are showing up in your fdisk output (unless you're just using the card as JBOD).
My thinking is that Grub is having trouble finding stage 2 from stage 1.5, because stage 2 is on a partition that it can't recognize. If you are using true hw RAID, then this is quite strange; perhaps you should try reinstalling grub (with the root and setup lines) from a live cd, chrooting to the proper root so that everything is found properly. If this is software RAID, then it might be that the boot partition has RAID metadata that grub doesn't recognize; you could try recreating the /boot partition with the "--metadata=0.90" option to mdadm --create (0.90 metadata sits at the end of the partition, so the filesystem still starts where grub expects it).
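A sketch of the live-cd reinstall, assuming the layout from the fstab posted above (/dev/sda3 as root, /dev/sda1 as /boot); the mdadm line at the end only applies in the software-RAID case:

```shell
# From the live CD: mount the installed system and chroot into it so
# GRUB's stage files and menu.lst are found under the real /boot.
mount /dev/sda3 /mnt
mount /dev/sda1 /mnt/boot
mount --bind /dev /mnt/dev
mount -t proc proc /mnt/proc
chroot /mnt /bin/bash
# Then, inside the chroot:
grub-install --recheck /dev/sda

# Software-RAID case only (destroys the existing /boot contents!):
# 0.90 metadata lives at the END of the partition, so stage1.5 can
# read the filesystem without understanding mdadm superblocks.
# mdadm --create /dev/md0 --level=1 --raid-devices=2 --metadata=0.90 \
#       /dev/sda1 /dev/sdb1
```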
Hmm, yea that's what I thought too. It's been a while since I've done RAID. I have an ASUS motherboard with a Nvidia RAID controller:
http://www.asus.com/Motherboards/AMD_AM … ifications
I've already tried reinstalling Grub multiple times by chrooting, so I'm not sure what to do next.
Matt
Yeah, I'm not sure how that NVidia controller would present devices.
As a troubleshooting step, you could try taking the boot partition out of the RAID, put an ext2 partition on it, reinstall grub, and see if that works. If so, then you know that the problem has to do with the RAID.
Also, instead of doing the root, setup lines to install, you could use the grub-install command. It does a bunch of checks, copies some files, and eventually runs root and setup (though with some extra options attached).
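Something like this, run from the installed system (or from a chroot, as above), with the separate /boot partition from the fstab already mounted:

```shell
# --recheck regenerates /boot/grub/device.map before installing, which
# matters here since the disk order changed when the RAID was added.
grub-install --recheck /dev/sda
```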
I tried grub-install with the same results.
I actually just reinstalled my whole Arch install onto the RAID and I have the same hang issue. Interestingly, if I unplug the second drive in my RAID-1 config, then it boots. Any suggestions?
Thanks!
Matt
I actually just reinstalled my whole Arch install onto the RAID and I have the same hang issue. Interestingly, if I unplug the second drive in my RAID-1 config, then it boots. Any suggestions?
Have you done the grub install on both /dev/sda and /dev/sdb? What is your BIOS' boot order? I'm thinking that your second drive is higher in the boot order than your first drive, but the second drive doesn't have a good grub installation (maybe with the wrong /boot location). Since you're getting the "Loading stage 1.5" message, you know that both stage1 is getting installed (in the MBR) and stage1.5 is getting installed (almost certainly in sectors 2-63 of the disk). So it's got to be that stage 1.5 is pointing to an incorrect (or unknown) location to load stage 2. And since this works if the second drive is pulled out, that would indicate that the two disks don't have the same data on them.
Unless, of course, something funky is going on with the Nvidia RAID.
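A sketch of installing to both members (this is the usual trick for mdadm RAID-1, but it seems worth trying here too, since both disks show up individually): the grub shell's `device` command remaps (hd0) onto each physical disk in turn, so each MBR gets a stage1 that looks for /boot on its own disk.

```shell
# Write stage1/stage1.5 to the MBR of both RAID members so the box
# boots regardless of which disk the BIOS tries first.
grub --batch --no-floppy <<'EOF'
device (hd0) /dev/sda
root (hd0,0)
setup (hd0)
device (hd0) /dev/sdb
root (hd0,0)
setup (hd0)
quit
EOF
```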
Yea, I think it has to do with the order too. I checked the BIOS, and the RAID array was at the top of the Hard Drives list, followed by the old boot drive I was using (my 3rd drive, not in the array, and now formatted as a data drive). I pulled the data drive out and left the array plugged in (with both drives plugged back in), and now I get:
GRUB Hard Disk Error
Which makes me think that the 3rd drive is affecting it somehow, even though that drive was listed as /dev/sdc.
Alright, for now I decided to revert back to my previous setup:
/dev/sda - single 800GB hdd on SATA0
/dev/sdb and /dev/sdc - 1 TB hdd RAID-1 setup on SATA2 and SATA3
Grub is booting fine off the single drive. I'm still a little mystified why /dev/sdc is showing up if it's a hardware RAID, but I guess I will just have to try unplugging a drive in the array and see if the data is still there.
Thanks graphene for all the help!
Matt
Yeah, I'm not convinced it's set up as an actual RAID (though most of my experience has been with mdadm and, independently, with LSI and 3ware cards, which definitely present the RAID volume, not the individual disks). Good luck!