
mac osx - multiple port-based apache vhosts on osx 10.6 not resolving properly



I have local development copies of a few websites on my Mac, and I want to access them in a browser through vhosts while still using the live (web) versions from time to time.



I have read many examples of people doing similar things by using a different hostname and having Apache serve that unique hostname from a local location. I have always done it with the same URL but a different port, and while that works seamlessly on Windows, I can't get it working on the Mac.



(Let's say) I have two websites:




  1. amazingwebsite.com


  2. facebookkiller.org



I want to access the local versions using the same URLs by enabling the browser's proxy (with one click), which I have pointed at port 8080. Apache is set to Listen *:8080 in httpd.conf.



In httpd-vhosts.conf (which is getting loaded) I have:



NameVirtualHost *:8080

<VirtualHost *:8080>
    ServerAdmin webmaster@amazingwebsite.com
    ServerName amazingwebsite.com
    ServerAlias www.amazingwebsite.com
    DocumentRoot "/Users/username/Development/Projects/amazingwebsite"

    <Directory "/Users/username/Development/Projects/amazingwebsite">
        Options Includes Indexes MultiViews
        AllowOverride All
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>

The 'facebookkiller.org' vhost is basically the same, just with a different local location.
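For reference, Apache's name-based selection on a port boils down to matching the request's Host header against each vhost's ServerName and ServerAlias values, falling back to the first vhost defined on that port. A toy sketch of that logic (the hostnames and the simplified matching here are illustrative, not Apache's actual code):

```python
# Toy model of Apache name-based virtual host selection:
# the Host header is compared against ServerName/ServerAlias;
# if nothing matches, the first vhost on the port is the default.
from fnmatch import fnmatch

VHOSTS = [  # first entry doubles as the default server
    {"name": "amazingwebsite.com", "aliases": ["www.amazingwebsite.com"]},
    {"name": "facebookkiller.org", "aliases": ["www.facebookkiller.org"]},
]

def select_vhost(host_header: str) -> str:
    host = host_header.split(":")[0].lower()  # strip any :8080 suffix
    for vh in VHOSTS:
        if host == vh["name"] or any(fnmatch(host, a) for a in vh["aliases"]):
            return vh["name"]
    return VHOSTS[0]["name"]  # unmatched Host header -> default server

print(select_vhost("www.amazingwebsite.com:8080"))  # amazingwebsite.com
print(select_vhost("unknown.example:8080"))         # amazingwebsite.com (default)
```

The point being: ServerAlias handles the matching once the request reaches Apache; whether the request reaches Apache at all is decided earlier, by name resolution.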



My /private/etc/hosts is now set to:



##

# Host Database
#
# localhost is used to configure the loopback interface
# when the system is booting. Do not change this entry.
##
127.0.0.1 localhost
255.255.255.255 broadcasthost
::1 localhost
fe80::1%lo0 localhost


127.0.0.1 amazingwebsite.com
127.0.0.1 facebookkiller.org


and after an Apache restart (Web Sharing off/on), apachectl -S reports:



VirtualHost configuration:
wildcard NameVirtualHosts and _default_ servers:
*:8080 is a NameVirtualHost
default server amazingwebsite.com (/private/etc/apache2/other/httpd-vhosts.conf:4)

port 8080 namevhost amazingwebsite.com (/private/etc/apache2/other/httpd-vhosts.conf:4)
port 8080 namevhost facebookkiller.org (/private/etc/apache2/other/httpd-vhosts.conf:19)
Syntax OK


which looks ok to me.



The behaviour:





  • amazingwebsite.com:8080 => local installation (correct)

  • www.amazingwebsite.com:8080 => times out (incorrect)

  • amazingwebsite.com => fails quickly, browser can't connect (incorrect)

  • www.amazingwebsite.com => goes to web version (correct)

  • facebookkiller.org:8080 => local installation (correct)

  • www.facebookkiller.org:8080 => times out (incorrect)

  • facebookkiller.org => browser can't find (incorrect)

  • www.facebookkiller.org => goes to web version (correct)




So my ServerAlias isn't working, or there is something wrong with my hosts file - or both!



I've spent ages on this and could really do with some help. Thanks.


Answer



The vhosts are working; you just haven't written your local /etc/hosts correctly. The 'www.' lookups time out because they fall through to real DNS, which most likely returns a CNAME pointing back at the non-www version.



127.0.0.1   localhost google.com www.google.com amazon.com www.amazon.com


That's how you use a hosts file: put all the aliases for an IP on one line, and make sure you list every hostname you need, including the www. variants, so that those lookups are sent to your local Apache instance instead of out to real DNS.
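In your case that means something like: 127.0.0.1 localhost amazingwebsite.com www.amazingwebsite.com facebookkiller.org www.facebookkiller.org. You can sanity-check a hosts file for exactly this problem with a few lines of code. A simplified sketch (it assumes the standard "IP name name ..." format with # comments, and uses your hostnames):

```python
# Report which required hostnames a hosts file maps to 127.0.0.1.
HOSTS_TEXT = """\
127.0.0.1 localhost
127.0.0.1 amazingwebsite.com
127.0.0.1 facebookkiller.org
"""

REQUIRED = ["amazingwebsite.com", "www.amazingwebsite.com",
            "facebookkiller.org", "www.facebookkiller.org"]

def local_names(hosts_text: str) -> set:
    names = set()
    for line in hosts_text.splitlines():
        line = line.split("#")[0].strip()  # drop comments
        parts = line.split()
        if parts and parts[0] == "127.0.0.1":
            names.update(parts[1:])  # every alias on the line counts
    return names

missing = [h for h in REQUIRED if h not in local_names(HOSTS_TEXT)]
print(missing)  # ['www.amazingwebsite.com', 'www.facebookkiller.org']
```

With the hosts file from the question, the www. variants come back missing, which is exactly why those requests went out to real DNS and timed out.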


