
SteelSeries 7H Final Review

Published 4/13/2015 by createdbyx in News | Reviews

I don't like to write reviews immediately after receiving an item; I'd much rather wait a good long time before posting. I purchased these headphones on Oct 25, 2010 and am only posting this now because they are still for sale on NCIX and elsewhere.

Ultimately they worked great for about 2 years of infrequent use, but considering that they are a mid-range (price-wise) headset and I paid $130 after tax & shipping, I expected them to last a lot longer.

After the first few months the glue partly separated on one side of the top padding (see link to picture below).

Then the cord developed a break at the point where the cable merges with the micro-USB-style end (the end that plugs into the left ear muff). I fully admit the cord break was most likely my fault. 2-3 times, while wearing the headphones and sitting in my desk chair, I bent over to pick something off the ground and yanked on the cord. Not enough to pull them off my head, but enough for me to curse at my own stupidity. :P  Even though they did not break after being yanked, they did eventually stop working and I needed to bend and twist the cable to get the sound working.

I finally contacted SteelSeries support and was able to purchase a replacement cable for just $5 through PayPal. I bought 2 cables so that if it broke again I would not have to go through the replacement process a second time.

Over the last year the right ear muff would periodically stop working; I would only have to twist the headphones on my head slightly to get it working again. I suspected at the time that it was a connection issue at the point where the top head piece clips into the right ear muff. There are 4 springy pins in the ear muffs that make the connection with the top piece. I thought there was just a lack of solid contact, but the problem got a bit worse even after a bit of polishing/cleaning.

They finally died this past week when the right ear muff stopped working completely. I broke down and decided to take the right ear muff apart. I inspected the cable where it passes through the hinge that allows the muff to rotate 90 degrees, and there did not seem to be any issues with frayed or pinched wiring. Before putting them back together I did adjust the wire routing slightly to give it more slack, but it did not fix the issue.

Next I took apart the top piece. Three screws hold the connection points at each end of the top piece, and there did not appear to be any issues there. Then I peeled back the padding and tried taking off the plastic panel via 4 screws, but there are 2 plastic weld points that prevented me from removing the panel. I suspect the problems are hidden behind the panel, but I can't get access without destroying the panel in the process.

Here is a picture showing the breaks in the plastic, glue separation, and stress points. 7H Problems Screenshot

The glue separation is due to a combination of the heat from my head and the flex in the top part of the head band, and it never came apart more than what you see in the picture.

I rated this two stars because a 5-star headset is perfect: every part keeps working even after a few years of use, and it can take a beating. Based on my experience these are not able to take much of a beating in day-to-day use. I only ever used them periodically at night when other people were sleeping; otherwise I prefer to use my 5.1 surround sound speakers. I used them a few times when traveling and kept them in a laptop backpack, but always took care to ensure their safety.

All in all, when they did work they worked great, but ultimately I won't be buying another SteelSeries headset. I have nothing against SteelSeries, but I am the type of person that expects more out of the products I buy. I'm sure others have had great experiences with their 7H's, but I feel my experience was lacking. Buying headphones off the internet is always a crapshoot, as you can only rely on online reviews, and each person has different hearing and tastes.

I have since purchased a Logitech G430 as a replacement based on the reviews, specs, and the fact it has a 3 year warranty.


Work Delegation

Published 4/8/2015 by createdbyx in News

I came across a YouTube video related to delegating your work and thought I would try to break it down further, as the video did not quite make sense to me at first.

https://www.youtube.com/watch?v=6dIuA69p6dY&t=3m45s

Me                   
==========           
5 day work week       
8hrs day           
$25hr wage           
$200 a day           
$1000 week income   

Assistant
==========
1 day work week
8hrs day
$30hr wage
$240 day
$240 week income

(me) $1000 a week - (assistant) $240 a week = (me) $760 at end of week

In other words I net $760 in adjusted weekly income while personally working 5 days a week.

Working 5 days myself and spending $240 a week for someone to work one day means a 6-day work week technically gets put in, but I personally earn $760. Normally I would have had to work a 6-day week myself; hiring the assistant means I can keep a regular 5-day week.

In reality, if I personally worked 5 days a week without an assistant I would earn $1000 a week. At 4 days a week at $25hr I make $800, Monday to Thursday. I hire an assistant at $30hr to work on Friday: they get my $200-a-day wage for taking my place, plus an extra $40 from me so that they earn their $30hr ($240 a day) wage.

So at the end of the week I make $800 - $40 = $760 income for a 5-day work week even though I only actually worked 4 days. I take home $760 after having paid $40 to not work on Friday. The benefit to me is that for a $40-a-week expense I can work 4 days a week and have someone else do my job on Friday.

As for the assistant, they are getting paid more than I do at $30hr, because I pay them an additional $40 for that day of work. So the assistant doing my job for one day actually gets paid $5hr more than I would have earned.

So what benefit does paying someone to work that one extra day actually get me? I still work 4 days a week, and I have to spend money hiring an assistant to cover that extra day, so where does that get me?

Essentially what I am getting from hiring an assistant is 24hrs of free time, and it only costs me $40. Having 24hrs of free time allows me to get other things done. This is of course totally dependent on how valuable the assistant's work is and whether or not they are properly trained and capable of performing the assigned tasks. From the assistant's perspective, they are making $5hr more than I do but are under more pressure to perform in order to keep the job and justify the $30hr wage.
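The arithmetic above can be condensed into a quick sanity check. This is just a throwaway Python script verifying the numbers from the tables, not part of any project code:

```python
HOURS_PER_DAY = 8
MY_RATE = 25         # my wage, $/hr
ASSISTANT_RATE = 30  # assistant's wage, $/hr

my_daily = MY_RATE * HOURS_PER_DAY               # $200 per day
assistant_daily = ASSISTANT_RATE * HOURS_PER_DAY # $240 per day

# Framing 1: I work all 5 days ($1000) and pay the assistant's full extra day.
framing_1 = 5 * my_daily - assistant_daily

# Framing 2: I work 4 days ($800), the assistant takes my $200 Friday wage,
# and I top them up with the $40 difference out of my own pocket.
top_up = assistant_daily - my_daily
framing_2 = 4 * my_daily - top_up

print(framing_1, framing_2)  # both framings net me $760
```

Either way the take-home works out to $760: the $40 top-up is the real weekly cost of the assistant's extra day.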


When the Oculus Rift became a successful Kickstarter it helped relaunch worldwide interest in the promise of VR. Even I began to realize that VR could actually become a thing given the advances in technology. But in the subsequent years I can’t help but see all the familiar patterns and signs that led to the downfall of VR back in the 1990’s.

I hate to be a pessimist, but it’s hard not to be as the events of recent years have unfolded. I’ve been wanting to write about VR for a while now and finally decided to write down a few of my thoughts on the subject.

But first, let’s sing the praises of VR, shall we?

When I caught wind of the Oculus Rift it took me by surprise. Having been through the disappointment of VR back in the 90’s, I came so close to buying a Forte VFX-1 headset but never did. Being a tech geek and budding programmer at the time, I knew what I’d be getting, and I just couldn’t justify the cost. The plan was to wait for a newer, better, cheaper version to come out, but nothing ever did.

A funny thing happened between the 1990’s and when Oculus made VR popular again: I completely forgot about VR and rarely, if ever, thought about it. It was a distant, bitter memory of what could have been. I had resisted the urge to get a smart phone out of bitter resentment towards the way the cell carriers operate, but along came OnePlus, and they won me over with an off-contract phone that was high end yet still relatively affordable. The OnePlus One was the first cell phone I’ve ever owned and it’s pretty nice. Not having a smart phone meant that I was out of the loop as to what had been going on in the smart phone space. Oculus made me realize that smartphones had indeed exploded, and along with them the prices of small high-resolution screens and accurate tracking sensors had come down significantly. VR could be a real thing again!

Then along comes Sony and their Project Morpheus. Wow, I thought, there may be something to this VR thing after all. Everything I was seeing from Oculus and Sony was very promising and I was getting excited for VR all over again. Then John Carmack, a titan in the gaming industry, left id Software to join Oculus and BAM! Drop the mic. VR IS A THING, AND WILL BE A THING! If John Carmack just joined Oculus, VR was going to be a thing, and it was going to be awesome, and oh my F-ing God just take my money NAO!!! F-Yeah! VR is back baby! Wuhooo.

Then the news broke and the internet was set ablaze: Facebook acquires Oculus.

“/facepalm Kill me now. Just, fuck off. FUCK! And I was on such a high. It was fun while it lasted, I guess.” Well, that was my initial reaction. You see, I am not what you would call a social butterfly like billions of others. I don’t use social media because I know what it really is, and it has nothing to do with being social. But that is a discussion for another blog post. After taking a deep breath, and as much as I hated admitting it, Facebook acquiring Oculus was for the best. It meant Oculus now had almost limitless funding and could really invest heavily in making the best VR headset money could buy at a price that was reasonable. VR was still on track to becoming a reality, and a $2 billion investment means investors will want to see a return on that investment.

Then the unthinkable. Seemingly out of the woodwork, and not to be outdone, Microsoft announces the HoloLens, an augmented reality headset. F**k, now I have to buy two devices. Grumble grumble. I see HoloLens as the ultimate productivity tool for programming. Not just programming, but programming anywhere, not just sitting at my desk. Programming with multiple virtual screens hovering around me and around my house? F-me, that would be so awesome and useful. Not to mention, for makers, prototyping in 3D right in your physical space.

With all the buzz going around, with VR & AR devices being announced every other week, it seemed the stink of the 90’s had completely worn off. With so many contenders in the race the future looked promising.

… Then the shit just went nuclear …

HTC announces the Vive and a partnership with Valve. Not only that, but in every article, YouTube video, and podcast, the people who got to experience the HTC Vive said it was the best damn VR experience they had ever had, even better than what was on offer from Oculus. Just when I thought my anticipation could not get any higher, I, … I may have peed a little. :P Just kidding. It wasn’t just the announcement, it was also the announcement of a shipping date: fall of 2015. At the time Oculus had been sketchy about release dates, as was every other VR and AR device maker.

Ho-ly-crap, the tech for the Vive and the demos people raved about made me a believer. Sorry Oculus, but you just got sent to the back of the bus. I am now leaning towards getting a Vive headset: wicked tech and a standing experience in VR along with accurate VR controllers. But it was one demo that made me a believer, that made my heart pulse and my palms sweat. The Portal tech demo where you had to fix the broken robot blew my mind and made me want more. I cannot describe in words how awesome that demo looks. It feeds into so many aspects of a tech geek’s life: VR, high tech, information-rich, programming, engineering, puzzle solving, accurate 1-to-1 controller interactions and tooling. Simply awesome sauce. Everything I love about tech all in one demo. The kind of demo that makes you evaluate your life choices to ensure that you live long enough to experience it first hand. Hahaha!

With all the buzz in the VR & AR space, and with all the various peripherals (Virtuix Omni, Cyberith Virtualizer, and Sixense STEM system, to name a few), there is a lot to be excited about. But then again, after you calm down, take a step back, and begin reflecting upon it all, you find the ghosts from the past reaching up to grab hold of you …

Hardware Costs

In my opinion there is a strong chance we are going to see déjà vu of 90's VR all over again. Yes, the big 3 VR headsets (Morpheus, HTC Vive, Oculus Rift, and maybe OSVR) are great, but they suffer the exact same problems that plagued their 90's counterparts.
 
For one, the price. $200 to $500+ for a headset? Automatic fail. I suspect these companies are seeing through a haze of dollar signs as a new market appears on the horizon. I’m sure it’s the same haze that overtook their 90’s counterparts. They must think that because they get away with selling $700+ smart phones, they can get away with selling “premium” headsets. Instead they need to be thinking critically and making investments toward the long-term viability of the platform. The only way I see VR truly taking off in a reasonable time frame (i.e. the next 3-5 years) with a sufficient install base is if those who can afford it (Facebook, HTC/Valve, Sony) sell their hardware below cost or at cost for at least the first year or so.
 
If they can sell a headset at cost for around $99 then there is a chance it can be fast-tracked and go mainstream quickly, rather than slowly seeding units over the next decade as the masses adopt future, cheaper revisions of the product. This is critical because without hardware in the hands of consumers there is no VR market, and there is zero incentive for developers to spend their time making games for it. Period. You absolutely need to get headsets into people's hands by any means necessary and plant the seeds of a future in VR that you can then profit from, and build upon, in the coming decades.
 
Stop and think about that rumored $300-$500 “premium” price tag for an HTC Vive for a second. What else can you buy for $500 or less? An Xbox One, a PS4, even a decent tablet or computer for a casual PC user. Comparatively, a VR headset is a very simple display device, whereas consoles and PCs contain a motherboard, memory, graphics card, hard drive, a pair of controllers, etc. Much larger and much more complicated devices sell for less than $350-$400.

That is to say nothing of how much additional cost there is for VR controllers like the Vive wands, or the PS Move wands, or older hardware like the Razer Hydra.
 
Consumers are not stupid. The assumption that they are stupid is stupid. A $200-$500 headset, which is nothing more than a few sensors and a screen, is not going to be a hot ticket item in the eyes of the mainstream consumer.

Just look at Samsung's Gear VR selling for a ridiculous $200 when it is nothing more than a hunk of plastic, 2 lenses, and the standard tracking sensors, with no screen. Tracking sensors, by virtually all accounts, are cheap given the rise of smart phones. And you have to buy a Note 4 just to use the damn thing, bringing the total cost of ownership even higher.

And the mainstream consumer is who you have to convince to kick-start a VR future. If VR stays relegated to a select few dedicated gamers and the like, you can expect the VR market to be a slow, decade(s)-long crawl to adoption. If initial headsets arrive at an attractive price point that a mainstream consumer can actually afford (after buying a PC or console to power them), then you can expect VR to take off in a big way.

UKRifter put it best when he recently stated in one of his DK2 videos: “… if they’re thinking of charging that kind of premium, I’m kind of out. You know I’ve got kids & expenses in my life … I love VR but I’m not going to be paying those prices. And you’ve got to realize virtual reality is for the poor …”

Planning ahead

As of this posting I have yet to see any of these companies offering preorders. I admit it is April 2015 and the earliest headsets are rumored to arrive in late 2015, but as a headset maker it would make sense to start polling people to get actual numbers on how many plan to buy a headset at launch. It’s the same problem that plagued the Wii and other hardware: lack of stock at release and for months thereafter.

Headset makers need to be smart about how many units they can expect to ship at launch. If it were up to me, I would start preorders, and start taking people's money, no later than July 2015 or early fall. That way you know exactly how many people are buying, exactly how many headsets you need ready for launch, and how many additional units to prep for mainstream retail sales.

It’s 2015 and I can’t stand it when companies don’t take advantage of a global, internet-enabled customer base by polling or setting up preorders to ensure their product is available to people on day one. Creating artificial scarcity or introducing “premium” headsets is ill-advised when trying to kick-start a totally new platform.

Software & Content

You also need to take content into account. There will be very little content available at launch and for a year or two after. Dropping upwards of $500 to play only a small handful of VR games is not going to cut it, let alone games the consumer would actually want to play. People, especially mainstream consumers, are not going to justify a $500 headset with no content for it. But at $99? They would, especially knowing that more content is on the way.

I have to admit it frustrates me to no end to see argument after argument on various forums and YouTube about the technical aspects of VR from people who are not technically inclined or trained. They seem to have this view that they are going to see high-detail, high-poly-count AAA VR games, and that supporting them will cost a fortune in PC hardware, or that the consoles don’t have enough horsepower for VR. A ridiculously stupid argument.

Initial VR games are going to have low-poly, cartoony, WoW-style graphics in order to maximize the range of supported hardware already out there. I expect the quality of VR games to be a few years behind their standard non-VR counterparts.

It’s only been roughly 2-3 years since the Oculus Kickstarter, so developers have not been working on VR titles for very long, especially taking into account that many were waiting to see what became of the hardware. I’d say that until the big three (Sony, Oculus, HTC) threw their hats into the ring, developers had really only just started working on new titles in the last year. It takes a good 2-5 years to develop a game, and that does not even account for the additional challenges of developing for VR. Even adapting existing titles to VR can be tricky.

I can’t imagine there will be more than, say, 12 to 24 decent VR titles at launch, plus a slew of half-baked demos and mini games. Not a lot for a consumer to justify a $200-$500 price tag.

Conclusion

Establishing a new VR market is a huge uphill battle. On one hand, you need headsets in the hands of enough consumers to make the platform viable for developers to make content for. On the other, there is technically a lot of content out there that could be adapted to VR, but with no headsets in consumers' hands, developers are not likely to take the time to adapt their preexisting titles for a platform that has yet to see adoption. A classic chicken-and-egg scenario.

For VR to be considered a success within the next 5 years and achieve widespread adoption, headsets need to be priced under $150, absolutely no more. I hope I’m wrong, but realistically, when I take my VR blinders off and weigh all the factors, I just don’t see VR going mainstream any time soon. Like, the next decade soon. I also think there is a good chance the big three VR makers could poison the well once more and make VR a difficult sell for widespread adoption.

When you put it all together, all I’m really seeing is very high prices to get into VR, not to mention additional costs for VR controllers and a decent or high-end PC depending on your gaming preferences, and an industry seemingly all too ignorant of history and setting themselves up to fail.

Maybe I’m wrong, and I hope I am; time will tell whether, in the next 5 years, VR gets a well-established foothold and is on its way to becoming mainstream. As someone who was let down by the promise of VR in the 90’s, I’m not holding my breath.


The Unity 5 documentation has a couple of handy reference charts for when you need help determining what material values to set to get a particular look and feel.

http://docs.unity3d.com/Manual/StandardShaderMaterialCharts.html


The introduction of Unity 5 brings some API changes. Specifically, this footnote was interesting: “[2] in Unity5 we also cache the transform component on the c# side, so there should no longer be a performance reason to cache the transform component yourself.”

I decided to test it out by writing a few performance test scripts and comparing the numbers. Below is a screenshot of my results along with the scripts used.

As you can see, caching a reference to the Transform component in the Start method and then using that reference is still faster than calling “this.transform” directly, albeit only slightly, by about 10-20 ticks. And calling “this.GetComponent&lt;Transform&gt;()” is almost twice as slow.

Unity5TransformPerformance

The code for the PerformanceTesting class is available here.

TransformCachedGetComponentPerformance script

    using UnityEngine;

    public class TransformCachedGetComponentPerformance : MonoBehaviour
    {
        private Transform reference;

        /// <summary>
        /// Awake is called when the script instance is being loaded.
        /// </summary>
        public void Awake()
        {
#if PERFORMANCE
            var perf = PerformanceTesting.PerformanceTesting<string>.Instance;
            perf.Create("UnityTesting/TransformCachedGetComponentPerformance/Update");
#endif
        }

        /// <summary>
        /// Start is called just before any of the Update methods is called the first time.
        /// </summary>
        public void Start()
        {
            this.reference = this.GetComponent<Transform>();
        }

        /// <summary>
        /// Update is called every frame, if the MonoBehaviour is enabled.
        /// </summary>
        public void Update()
        {
#if PERFORMANCE
            var perf = PerformanceTesting.PerformanceTesting<string>.Instance;
            perf.Start("UnityTesting/TransformCachedGetComponentPerformance/Update");      
#endif

            var rnd = new System.Random();
            this.reference.localPosition = new Vector3(rnd.Next(-3, 3), rnd.Next(-3, 3), rnd.Next(-3, 3));

#if PERFORMANCE
            perf.Stop("UnityTesting/TransformCachedGetComponentPerformance/Update");
#endif
        }
    }

TransformGetComponentPerformance script

    using UnityEngine;

    public class TransformGetComponentPerformance : MonoBehaviour
    {
        /// <summary>
        /// Awake is called when the script instance is being loaded.
        /// </summary>
        public void Awake()
        {
#if PERFORMANCE
            var perf = PerformanceTesting.PerformanceTesting<string>.Instance;
            perf.Create("UnityTesting/TransformGetComponentPerformance/Update");
#endif
        }
        
        /// <summary>
        /// Update is called every frame, if the MonoBehaviour is enabled.
        /// </summary>
        public void Update()
        {
#if PERFORMANCE
            var perf = PerformanceTesting.PerformanceTesting<string>.Instance;
            perf.Start("UnityTesting/TransformGetComponentPerformance/Update");
#endif

            var rnd = new System.Random();
            this.GetComponent<Transform>().localPosition = new Vector3(rnd.Next(-3, 3), rnd.Next(-3, 3), rnd.Next(-3, 3));

#if PERFORMANCE
            perf.Stop("UnityTesting/TransformGetComponentPerformance/Update");
#endif
        }
    }

TransformFieldPerformance script

    using UnityEngine;

    public class TransformFieldPerformance : MonoBehaviour
    {
        /// <summary>
        /// Awake is called when the script instance is being loaded.
        /// </summary>
        public void Awake()
        {
#if PERFORMANCE
            var perf = PerformanceTesting.PerformanceTesting<string>.Instance;
            perf.Create("UnityTesting/TransformFieldPerformance/Update");
#endif
        }

        /// <summary>
        /// Update is called every frame, if the MonoBehaviour is enabled.
        /// </summary>
        public void Update()
        {
#if PERFORMANCE
            var perf = PerformanceTesting.PerformanceTesting<string>.Instance;
            perf.Start("UnityTesting/TransformFieldPerformance/Update");
#endif

            var rnd = new System.Random();
            this.transform.localPosition = new Vector3(rnd.Next(-3, 3), rnd.Next(-3, 3), rnd.Next(-3, 3));

#if PERFORMANCE
            perf.Stop("UnityTesting/TransformFieldPerformance/Update");
#endif
        }
    }

The following code snippet is designed to take in a flat list of file paths (or similar data) and produce a hierarchy of tree nodes representing those file paths.

        /// <summary>
        /// Constructs a nested hierarchy of types from a flat list of source types.
        /// </summary>
        /// <typeparam name="TSource">The source type of the flat list that is to be converted.</typeparam>
        /// <typeparam name="TReturn">The type that will be returned.</typeparam>
        /// <typeparam name="TPart">The type of the part that each source item is split into.</typeparam>
        /// <param name="sourceItems">The source items to be converted.</param>
        /// <param name="getParts">A callback function that returns a array of <see cref="TPart"/>.</param>
        /// <param name="comparePart">The compare part callback.</param>
        /// <param name="getChildren">The get children callback.</param>
        /// <param name="addChild">The add child callback.</param>
        /// <param name="createItem">The create item callback.</param>
        /// <returns>Returns a collection of <see cref="TReturn"/> items representing the hierarchy.</returns>
        /// <exception cref="Exception">A delegate callback throws an exception. </exception>
        private static IEnumerable<TReturn> ToHierarchy<TSource, TReturn, TPart>(
            IEnumerable<TSource> sourceItems,
            Func<TSource, TPart[]> getParts,
            Func<TReturn, TPart, bool> comparePart,
            Func<TReturn, IEnumerable<TReturn>> getChildren,
            Action<IEnumerable<TReturn>, TReturn> addChild,
            Func<TPart[], int, TSource, TReturn> createItem)
        {
            var treeModels = new List<TReturn>();
            foreach (var keyName in sourceItems)
            {
                IEnumerable<TReturn> items = treeModels;
                var parts = getParts(keyName);
                for (var partIndex = 0; partIndex < parts.Length; partIndex++)
                {
                    var node = items.FirstOrDefault(x => comparePart(x, parts[partIndex]));
                    if (node != null)
                    {
                        items = getChildren(node);
                        continue;
                    }

                    var model = createItem(parts, partIndex, keyName);
                    addChild(items, model);
                    items = getChildren(model);
                }
            }

            return treeModels;
        }

An example of how one could use the ToHierarchy method:

        var separator = new[] { Path.AltDirectorySeparatorChar.ToString(CultureInfo.InvariantCulture) };
        // The paths variable could be something from the Directory.GetDirectories method, for example.
        var nodes = ToHierarchy<string, TreeViewNode, string>(
            paths.OrderBy(x => x),
            x => x.Split(separator, StringSplitOptions.RemoveEmptyEntries),
            (r, p) => string.CompareOrdinal(r.Name, p) == 0,
            r => r.Nodes,
            (r, c) => ((List<TreeViewNode>)r).Add(c),
            this.CreateTreeNode);

        private TreeViewNode CreateTreeNode(string[] parts, int index, string source)
        {
            var node = new TreeViewNode() { Name = parts[index] };
            node.Value = string.Join(Path.DirectorySeparatorChar.ToString(CultureInfo.InvariantCulture), parts, 0, index + 1);
            if (index == parts.Length - 1)
            {
                node.Name = Path.GetFileName(source);
            }

            node.IsFile = File.Exists(node.Value);
            return node;
        }

Where paths is an array of file paths from, say, Directory.GetFiles.


Source: http://stackoverflow.com/questions/616718/how-do-i-get-common-file-type-icons-in-c

    using System;
    using System.Drawing;
    using System.Runtime.InteropServices;

    public static class FileIcon
    {
        [DllImport("shell32.dll")]
        private static extern IntPtr SHGetFileInfo(string pszPath, uint dwFileAttributes, ref SHFILEINFO psfi, uint cbSizeFileInfo, uint uFlags);

        [StructLayout(LayoutKind.Sequential)]
        private struct SHFILEINFO
        {
            public IntPtr hIcon;
            public IntPtr iIcon;
            public uint dwAttributes;

            [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 260)]
            public string szDisplayName;

            [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 80)]
            public string szTypeName;
        };

        private const uint SHGFI_ICON = 0x100;
        private const uint SHGFI_LARGEICON = 0x0; // Large icon
        private const uint SHGFI_SMALLICON = 0x1; // Small icon

        public static System.Drawing.Icon GetLargeIcon(string file)
        {
            var shinfo = new SHFILEINFO();
            var hImgLarge = SHGetFileInfo(file, 0, ref shinfo, (uint)Marshal.SizeOf(shinfo), FileIcon.SHGFI_ICON | FileIcon.SHGFI_LARGEICON);
            return System.Drawing.Icon.FromHandle(shinfo.hIcon);
        }

        public static System.Drawing.Icon GetSmallIcon(string file)
        {
            var shinfo = new SHFILEINFO();
            var hImgLarge = SHGetFileInfo(file, 0, ref shinfo, (uint)Marshal.SizeOf(shinfo), FileIcon.SHGFI_ICON | FileIcon.SHGFI_SMALLICON);
            return System.Drawing.Icon.FromHandle(shinfo.hIcon);
        }
    } 

Did you know that the GUI class has a matrix property that you can use to rotate and scale your GUI elements?

The sample behavior that is provided below will scale and rotate a gui label in the center of the screen.

using System;
using UnityEngine;

public class GuiMatrixDemo : MonoBehaviour
{
    private float rotation;

    /// <summary>
    /// OnGUI is called for rendering and handling GUI events.
    /// </summary>
    public void OnGUI()
    {
        var matrix = GUI.matrix;

        GUI.Label(new Rect(5, 5, 100, 20), "before matrix");

        this.rotation += 15f * Time.deltaTime;

        // Mathf.Sin(Time.time) + 2 already oscillates between 1 and 3;
        // the clamp just guards that range explicitly.
        var scale = Mathf.Clamp(Mathf.Sin(Time.time) + 2f, 1f, 3f);
        GUI.matrix = Matrix4x4.TRS(new Vector3(Screen.width / 2, Screen.height / 2, 0), Quaternion.Euler(0, 0, this.rotation), Vector3.one * scale);
        var size = GUI.skin.label.CalcSize(new GUIContent("test string"));
        var rect = new Rect((-size.x / 2f), (-size.y / 2f), size.x, size.y);
        GUI.Label(rect, "test string");
        GUI.matrix = matrix;
        GUI.Label(new Rect(5, 25, 100, 20), "after matrix");
    }
}

CMS theory for websites part 1

Published 12/18/2013 by createdbyx in News | Programming
Tags:

NOTE: Originally written in March 2013; I’m only getting around to posting this now. :(

A long, sordid history of technological bloat warez …

I have been kicking the following ideas around for a while now and may soon start to implement them. I have become frustrated with the whole slew of content management systems, from WordPress to BlogEngine. They all have a tendency to lock you into their system and their way of doing things, not to mention that there is no guarantee these platforms will still exist or be popular 5, 10, or 20+ years from now, as technology is constantly changing. The only thing that has remained relatively consistent over the last 20 years is HTML markup. In other words, the raw data.

I have maintained the content on this personal website of mine for over a decade, even before I registered the domain createdbyx.com. The one thing that has remained consistent is the manual labor involved every time I decided to port the content of this site over to a new platform: from straight HTML pages, to DotNetNuke, to BlogEngine, to a partial port to ASP.NET MVC, and so on. Porting the content from one platform to the next, adapting and converting it from one database to another and one theme/skin system to another, simply is not going to cut it any longer. As time goes on and the volume of content continues to grow, porting it over to yet another platform becomes more and more tedious.

I am a long-term thinker when it comes to this type of stuff because I fully intend to maintain and update this site for the next 50+ years and beyond. I care about preserving the content and data on my various websites in a human-readable, platform-independent format.

I began playing around with some ideas on my test server boobietaunt.com using ASP.NET MVC. My intention was to develop a single-page website with a Live Tiles system similar to the Windows 8 start screen. The whole thing would be rendered via jQuery GET requests that dynamically replace content on the page, so there would be no page loading at all. But as I discovered, search engines like Google frown on this because it’s too difficult for their systems to crawl and index the website content, and it would ultimately obliterate my SEO search rankings. So that approach was not going to work, not to mention the issues with managing history via JavaScript, browser incompatibilities, etc. *sigh* just kill me nao :(

Although I never fully abandon any of my coding work, I did stop working on the ASP.NET MVC implementation of my prototype createdbyx.com website in favor of an even simpler solution. Or so I thought. With ASP.NET MVC I again found myself needing to learn new ways of doing things, i.e. the MVC way, and on top of that I was using the Razor view engine. As much as I applaud the ASP.NET MVC team's effort to make ASP.NET development easier and cleaner, I decided to stop using it before I devoted too much time going down that rabbit hole.

With ASP.NET MVC I was locking myself into a system of Razor syntax and Controllers that was again pushing me further away, putting more technological layers between me and what I was trying to do, which was simply to serve up HTML content. At this point, for reasons unknown, I got it into my head that I could write a better system than the ASP.NET MVC team, and in some ways that's just what I did. I essentially created a simplified MVC clone with Controllers and Views that could be extended upon, a feat I am personally proud of considering it has many of the same core features as MVC in just under 1,000 lines of code. I wrote it back when MVC 1 was still in beta and just being introduced, and then abandoned it to go work on my various other projects.

The custom-built MVC clone I created is (as of this writing) powering my codefarts.com website. After dusting it off and using it to power that site, I have again come to the conclusion that it is far too complicated a system to use, even though it is hardly any code at all and the compiled assemblies for handling localization, security, settings, blogging, Views, Controllers, and extensions amount to a combined 175 KB of (debug) DLL files.

[Image: the compiled Codefarts binary files]

The future going forward …

Trying to think up a ridiculously simple solution for publishing HTML content is deceptively complex with so many CMS choices out there, but there have been some new ways of doing things as of late that could change all that.

What if you could power a personal website without databases and without a traditional content management system? What if you could create and update the content on your local machine, have it automatically backed up to the cloud, automatically synchronized across multiple computers, and have it support versioning and branching like a version control system, all without even having to log in? You could search for content you have previously written, you could create or edit content even from someone else's computer, and it could support multiple contributors to your site.

Now what if you could do all of that using just HTML files and some popular free software? The answer is pretty much YES!

Here is a potential publishing scenario …

  1. Open up Notepad or your favorite HTML editor and write some content.
  2. Save it to a specific folder on your computer.
  3. The system automatically syncs the file to the cloud and to any other computers of your choosing (Google Drive, SkyDrive, Dropbox, AeroFS, etc.), providing automatic off-site backups.
  4. A program like SyncBack, run at scheduled times via the Windows Task Scheduler, syncs your local working copy of the site files to the web server.
  5. A program always running in the background detects the changes and commits them to a local git repository, with the option to also push to a git server for yet another off-site backup.

These are just some of the ways you could set things up to automatically publish your website content to the web server.
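The background-commit step of this scenario can be sketched in a few lines of shell. This is a minimal sketch, not a real tool: the `publish_site` function name and folder argument are illustrative, and it assumes git is installed and the site folder is already an initialized repository.

```shell
# Sketch of the auto-commit step: stage and commit any changed site files
# to a local git repository. Run from a scheduler or a file watcher.
publish_site() {
    site_dir="$1"
    git -C "$site_dir" add -A
    # Commit only when something actually changed since the last run.
    if ! git -C "$site_dir" diff --cached --quiet; then
        git -C "$site_dir" commit -q -m "Auto-publish $(date -u +%Y-%m-%dT%H:%MZ)"
    fi
    # Optionally mirror to an off-site git server for a secondary backup:
    # git -C "$site_dir" push origin master
}
```

The cloud-sync client and the scheduled SyncBack upload run independently of this; nothing here requires a database or any server-side CMS code.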

The end goal?

The elimination of any and all unnecessary server-side code. If I could think of a way to go back to managing a website using pure HTML pages I would, but some key pieces of server-side code still need to be present, like site templates, a blogging system, and page caching.

From a security standpoint, the most secure code you can write is no code at all. The more code you write, the more vulnerable you make yourself, so focusing on your absolute core needs is essential.

--------------------------------------

Since I originally wrote this post back in March 2013, I have written an entirely new (and more simplified) web server back end and have ported the local copy of my codefarts.com website over to it. I’ll be making future posts about this new system and the reasons why it is even better than the custom-built MVC clone described earlier in this post.


The funny thing about brain disorders is that they affect the one part of your body that you are unable to perceive as being afflicted with something. Kind of like how crazy people don’t know they’re crazy.

When trying to come up with a name for this disorder I asked stack exchange and got a number of helpful responses. Everything from Analysis paralysis to having a Rube Goldberg mentality.

No matter what I do, I can't help but overcomplicate almost every piece of code/application I write. My mind automatically jumps to thoughts of 'OK, I'll need to write this, this, this, and this, and I'll need to make sure to use a repository and MVVM patterns, utilize this and that library' etc. An hour of coding later and it's already spiraling out of control with features & settings that don't need to be there.

This last week I have been spending some time on my old WowTracks.com website, in particular the data-acquisition utilities needed to capture and store the World of Warcraft armory data. I started by thinking about what kind of app I needed and where it would run. I considered writing it in Unity3D first so that I could port the application to multiple platforms, but Unity’s GUI system is too much of a hassle when it comes to presenting large amounts of complex data. I also considered a WinForms app, Windows Store/Metro, WPF, Silverlight, or even a console application.

I ended up starting to write it as a console application, thinking that I did not need anything fancy, just an app that schedules downloads of armory URLs at specified intervals. It didn’t take long for me to realize that a console app was not going to cut it.

I then started to write it as a WinForms app, only to discover that the data binding was severely lacking compared to WPF and I would have to do a lot of manual coding. After scrapping the WinForms app in favor of a WPF app, things were going fine until I ran into a little issue with my PC randomly freezing up for one second at a time, then unfreezing for a second, and repeating until I restarted my computer. At first it was confusing because even after managing to get Visual Studio to quit my system seemed fine, but mousing over a link in a web browser (IE, Firefox, Chrome) would cause the system to start locking up at one-second intervals again.

Initially it seemed like a virus or hacker had gotten into my system, as it was only affecting browser applications, but from some internet searches it appears there is a huge bug involving .NET 4.0 and WPF. Something to do with UI Automation and large, complicated visual element trees causing WPF to lock up and slow down the system. This did not bode well for writing the application in WPF.

*sigh* back to writing the app in win forms again. /rollseyes

After a week of back and forth and nothing to really show for it, most of the time I spent and the code I wrote got me further and further from the end goal. Which leads me to the title of this post: I have self-diagnosed Anal-para-complica-tard-isis syndrome. In other words, I suffer from analysis paralysis complicatardation with acute over-engineering retarex.

Definitions:
Complicatardation (n.): "someone who is retardedly over-complicated."
Retarex (adj.): "something that is complex to a retarded degree."
Portmanteau (n.): "a combination of two (or more) words or morphemes, and their definitions, into one new word."

PS: You see what I did thar? :P


Created by: X

Just another personal website in this crazy online world

Author: Dean Lunz (aka Created by: X)
Computer programming nerd and tech geek.