SELinux ACLs with Apache

A quick reminder to myself (and to you, if you’ve come across my little site) to change SELinux file contexts when uploading new files to be served by Apache (httpd) on CentOS.

Yesterday I linked to some Radeon drivers in my post.

However, the linked zip file was showing ‘Access Denied’ errors, despite the correct filesystem permissions.

I had forgotten to also mark the file as something httpd should have access to, as far as SELinux on CentOS was concerned.

Without further ado, it simply took:

sudo chcon -v -t httpd_sys_content_t uploaded_file.ext
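If you want to double-check that the new context stuck, ls -Z will show the label on the file:

ls -Z uploaded_file.ext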


ATI Radeon Mobility X1400 on Windows 10

I’m getting an old Dell Inspiron E1505 upgraded to Windows 10.

Most things are going well, but there are definitely some driver difficulties, the first of which is graphics.

This comes via GreenReaper on the Microsoft forums, but I wanted to replicate it here just in case the post goes away:

(If you trust me, you can skip this whole part and simply download this zipped Windows 10 Radeon X1400 driver.)

First, download the driver from the Microsoft Update Catalog. If the link fails, search for:
ATI Technologies Inc. – Display – ATI Mobility Radeon X1400

Open the downloaded cab file, and copy the contents to a new folder.
Create *another* new folder within that folder called: B_72960
Copy all of the files into that folder as well (or it will error)

Reboot into Safe Mode
Click Start->Power
Hold down the Shift key and click Reboot
Click Troubleshoot->Advanced Options->Startup Settings->Reboot
Choose “Safe Mode” on reboot

When you’ve booted into safe mode, open Windows Explorer

Right click “This PC” and click “Manage”
Click Device Manager
Click the arrow next to “Display Adapters”
Right Click “Basic Display Adapter” (or whatever it says)
Click Update Driver Software
Click Browse my computer for driver software
Choose the folder with the downloaded driver
After the installation, reboot.

Then voilà! Enjoy your accelerated graphics and the beauty of Windows 10! :)
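As an aside: if you’d rather skip the Device Manager clicking on a future install, pnputil (run from an elevated command prompt on recent Windows 10 builds) can stage and install drivers from a folder of extracted .inf files. I haven’t tried it with this particular driver, and the path below is just a placeholder for wherever you extracted the cab:

pnputil /add-driver "C:\path\to\extracted\*.inf" /subdirs /install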

Some Fun With NFS and Windows

I have some Linux servers that I’d like to talk to my Windows Server 2012R2 file server.

Since I’d like daemons, rather than users, to be able to communicate with the server, I thought this would be a good candidate for NFS.

Linux Side (1st round)

(I’m using Centos, but the general concept will apply to Fedora, Ubuntu, etc.)

Install the daemons that will access the file server. Most of these will create their own users.

Create any additional users you would like to be able to access the file server. You can always add more later.

To save some complexity (and not assume you pay for Active Directory), I’m not going to have my file server look up Linux IDs via Active Directory. Instead, I’m going to use flat passwd and group files, just like Linux.

Copy (via SSH, USB, copy/paste, whatever) the passwd and group files from /etc/ over to your Windows server.

You can delete all of the entries for users/groups that will not be accessing the share.

Windows Side

Copy the passwd and group files to C:\Windows\System32\drivers\etc on the Windows server; that’s where Server for NFS looks for its mapping files.

Create users (and groups) on your server with the same user name / group name as you created on your Linux server.

The passwd and group files serve as a map between the user/group IDs in Linux and the user/group names in Windows.
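For example, with a made-up “mediauser” account, the relevant lines might look like this; the numeric IDs are what the Linux client sends over NFS, and the names are what Windows matches against its local accounts:

mediauser:x:1001:1001::/home/mediauser:/sbin/nologin   (passwd)
mediauser:x:1001:                                      (group)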

Install Server for NFS on the Windows server.

Server Manager->Manage->Add Roles and Features

Check “Server for NFS” (under File and Storage Services -> File and iSCSI Services), then Next->Next->etc. until installed.

Browse to the folder on your file server you are looking to share.

Right click on it and choose Properties

Go to the NFS Sharing tab

Click the “Manage NFS Sharing” button


Check the “Share this folder” check box.

The only other change I make here is to uncheck the “Enable unmapped user access” option so that only users in the passwd file we copied over will have access to the server.

Next, click on the Permissions button at the bottom


I like to set “All Machines” to be no access, that way only the servers I specify will be able to mount the share.

Click the “Add…” button.

In the “Add Names:” box, enter the IP address of your Linux server.

Make sure Type of Access is set to the type you are looking for.

I prefer to leave “Allow root access” unchecked for a bit more security.

Press OK, OK, Close

If everything worked, the folder icon should now show the folder as shared.


Using the security tab, assign NTFS permission to the folder for the users you would like to be able to read/write to that folder, just as you would if it were an SMB share.

Linux Side (2nd Round)

Install the NFS client, then enable the services (so they start on boot) and start them.

sudo yum -y install nfs-utils

sudo systemctl enable rpcbind
sudo systemctl enable nfs-server
sudo systemctl enable nfs-lock
sudo systemctl enable nfs-idmap

sudo systemctl start rpcbind
sudo systemctl start nfs-server
sudo systemctl start nfs-lock
sudo systemctl start nfs-idmap
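As an optional sanity check, you can confirm the services are running and registered with the portmapper:

sudo systemctl status nfs-server
rpcinfo -p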

Create a folder that will be used as the mount point for the file server, i.e. where you go on the Linux box to get to the files on the file server.

I was really hoping to find a definitive “this is where to mount nfs shares” article, but some Binging around came up with nothing.

I will therefore advise you create a folder under /mnt, as that feels right to me.

sudo mkdir -p /mnt/[server name]/[share name]

It’s finally time to give the share a test.
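Before mounting, you can optionally confirm the export is actually visible from the Linux box (showmount ships with nfs-utils):

showmount -e [server name or ip]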


sudo mount -t nfs [server name or ip]:/[nfs share name] /mnt/[server name]/[share name]

Make sure you are logged in as a user with permission to that folder and cd into it:

cd /mnt/[server name]/[share name]

You should now be able to create files and folders! (which will of course be visible on the file server as well)
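A quick way to prove write access is working (the file name here is just an example):

touch /mnt/[server name]/[share name]/nfs-write-test
rm /mnt/[server name]/[share name]/nfs-write-test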

The final step is to have the server automatically mount the share on boot.

sudo nano /etc/fstab

Add a line similar to:

[server dns name or ip]:/[share name]    /mnt/[file server name]/[share]  nfs     defaults        0 0
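With (made-up) names filled in, that line would look something like:

fileserver.example.lan:/media    /mnt/fileserver/media  nfs     defaults        0 0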

Give the server a reboot to test automatic mounting

sudo shutdown -r now

When you reboot, the share should be mounted and all is good in the world!

PS: If you are using this for transmission-daemon (which I’m assuming you’re using for legitimate purposes), make sure you edit your settings.json file and set "umask": 0, otherwise transmission will create folders that it cannot create files in.
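For reference, the value in settings.json is a plain JSON number, so the line looks roughly like this (stop the daemon before editing, since transmission rewrites the file on exit):

"umask": 0,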

New UI Prefab Scaling Solution

I recently changed the settings on my UI Canvas from “Screen Space – Overlay” to “Screen Space – Camera”, which is how I’m sure it was supposed to be done in the first place (n00b here).

Anyway, in doing so, all of my UI prefabs were coming in scaled to 53.3333%.

I did a quick hack using a script to set those back to 1, but I figured there must be an elegant solution.

I finally found the (so, so simple) solution in the Unity forums.

When instantiating a prefab into the UI, you need to add the “false” argument when setting the parent.


GameObject obj = (GameObject)Instantiate (prefab);
// false = keep the prefab's local position/rotation/scale under the new parent
obj.transform.SetParent (parent.transform, false);

instead of:

GameObject obj = (GameObject)Instantiate (prefab);
obj.transform.SetParent (parent.transform);

Windows 10 High CPU Usage Fix

I noticed recently that Windows 10 was using a high amount of CPU.

Checking Task Manager, I could see this was coming from Runtime Broker (RuntimeBroker.exe).

A bit of Binging around and I found this solution posted in the Microsoft forums.

Click Start->Settings

Click “System”

Click “Notifications & actions”

Finally, turn off “Show me tips about Windows”

Runtime Broker should immediately go back to normal.

I’m sure Microsoft will patch this soon, but this should get you by until then.


Removing Malware: Ubuntu and SCCM Endpoint Protection

I had a poor soul who was hit by encryption malware. It appears that the person was infected at home, which encrypted files in that person’s Dropbox account, which were then detected by SCCM Endpoint Protection on the company laptop.

To be safe, I wanted to make sure that the point of infection was in fact that home computer, and not a work laptop. However, I didn’t want to boot Windows, just in case.

Here’s what I did:

First, I downloaded and booted a copy of the Ubuntu 14.04 LTS Live/Installation DVD.

Then I downloaded SCCM Endpoint Protection for Mac and Linux from the Microsoft Volume License Service Center.

(Hint: you won’t see the download separately in the product chooser. Choose the *entire* SCCM Endpoint Protection category, and it will then appear as a separate download.)

The download will be an ISO containing the Mac and Linux clients as well as the documentation. I mounted the ISO and copied the relevant files to a flash drive (since the laptop DVD drive was in use from the Ubuntu Live DVD).
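If you want to do the same from the live session, mounting the ISO is a one-liner; the ISO file name below is just a placeholder for whatever the VLSC download is actually called:

sudo mkdir /mnt/scep-iso
sudo mount -o loop ~/Downloads/scep-client.iso /mnt/scep-iso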

Copy scep.amd64.deb.bin (assuming you’re using 64-bit Ubuntu) from the Linux/[version] folder to the liveuser’s home directory. You will need to make the file executable by running:

chmod +x scep.amd64.deb.bin

Then extract the .deb file by running:

./scep.amd64.deb.bin

and agreeing to the license agreement.

Next, I had to futz around a bit with 32-bit compatibility. In the end, this did the trick:

sudo dpkg --add-architecture i386
sudo apt-get update
sudo apt-get install lib32z1 lib32ncurses5 lib32bz2-1.0

You can now (at last) install the Endpoint Protection client by running:

sudo dpkg -i scep-4.5.10.amd64.deb

Next came a quick configuration of the web interface.

sudo nano /etc/opt/microsoft/scep/scep.cfg

Edit the [wwwi] section.
Make sure you set

agent_enabled = yes
listen_addr = ""
listen_port = port_of_your_choice (I used 8443)
username = "username_of_your_choice"
password = "password_of_your_choice"

Then restart the SCCM Endpoint Daemon

sudo /etc/init.d/scep restart
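To confirm the web interface actually came up on the port you chose, ss (or netstat) should show it listening:

sudo ss -tlnp | grep 8443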

Make sure you’ve mounted the infected drive. It should appear in the Launcher on the left as a hard drive icon. Click the icon to mount it.

Browse to https://localhost:8443 from that machine.
Log in with the username and password that you set in the configuration file.

Click “Control” in the top menu then “Update” in the left-hand menu. Click the “Update” button to update the definitions.

Wait until it has finished updating. You can check the status by clicking “View” for the appropriate entry on that page.

Once the update is finished, click “On-Demand Scan” on the left nav bar.

Choose “In-depth Scan” from the dropdown menu.
Under “Scan Targets” enter /media (the mounted Windows drive will live under there in the live session).
Then click “Scan files”


This will scan mounted drives including the mounted Windows drive.

To view the progress, click “View” next to the newly created job entry.

That’s it, happy malware removing!


DataTable() Fix in Mono 4.0

If you recently upgraded to Mono 4.0 and you use DataTables to return SQL results (in my case from Postgresql), you may have received a heart attack similar to mine when you were bombarded with the error:

SourceTable is required to be a non-empty string

This appears to be related to Bug #29557:

Though .NET is perfectly happy to allow

DataTable whatever = new DataTable();

Mono will throw an exception.
It is looking for you to name the data table.

To fix, simply change

DataTable whatever = new DataTable();

to

DataTable whatever = new DataTable("some_name");

Happy compiling!

Making MBPro work on TP-LINK Archer C7

The circle of life continues, and with it another router dies. I’ve had my eye on the Ubiquiti UniFi AC for a while, but the reviews have been very mixed. I’m hoping things get worked out in future firmware, but I can’t see spending that much on a dice roll. Instead, I figured I’d get an inexpensive hold-over until then.

On the advice of The Wirecutter, I went with the TP-Link Archer C7 v2. You can imagine my disappointment, then, when I could hardly keep my MacBook Pro (15 inch, Early 2011) connected. Digging through forums, it looks like I’m not alone.

Luckily, upgrading to the v14 firmware seems to have fixed the connection problems. On the downside, my config was completely wiped clean. So if anyone out there is struggling to get their MacBook Pro working on the C7, the firmware upgrade should do the trick, but make sure you jot down your configuration first!

NuGet Is Just Better

I was working on getting Postgresql, Visual Studio Remote Debugger, and PHP running on Server 2012 R2 so that I can up my debugging-fu, rather than just relying on Console.WriteLine.

I ran into some DLL hell trying to get Npgsql working. I saw NuGet mentioned while Binging for solutions, so I figured I’d give it a try. Where has this been all my coding life?

A few clicks, and remote debugging is up and running. I even copied the compiled files over to my production box, and everything is working fine in Centos on Mono as well. What a great way to spend a Friday morning off!