After using MS Outlook for many years, I have grown accustomed to being able to archive old contacts to organize my contact list. When I switched to Gmail, it was hard to believe that this basic functionality was missing. Yes, I could export the contacts as a backup and just delete the old ones from Gmail, but this is not as convenient as having all contacts readily available. Ultimately, my biggest issue wasn’t that I had hundreds of contacts in Gmail while using the web client; it was that all of those contacts got synchronized to my portable devices. I have seen suggestions online to create a contact group in Gmail for just the contacts that are desired on the phone (Android and iPhone allow filtering by contact groups), but that would require me to create a separate contact group for each portable device. My preference is to synchronize all of my contacts (regardless of which groups they are tagged with) except those that I no longer use.
The solution that I found is embarrassingly simple. The trick is that the parent contact group in Gmail (i.e. “My Contacts”) does not have to include all of the members of the sub-groups as the interface suggests by default. You can create a sub-group called “Archive”, tag contacts with that sub-group, and remove those contacts from the “My Contacts” group. Then, on a portable device, filter the contact list to only “My Contacts” which, by default, includes all the sub-groups except the “Archive” group which we explicitly removed. The other benefit of this approach is that the “Archived” contacts can still be tagged with other groups and are still readily available on portable devices when explicitly filtering to those groups (they just don’t show up in the main contact list). Wunderbar!
I last wrote about Designing an HTPC back in 2010 and 2011 so I figured that it’s time for an update. Overall, the setup is still very much functional but can benefit from some new technologies. Below are some upgrades that I have made over the years.
Waiting for the HTPC to boot was one of my biggest annoyances in its day-to-day use. Fortunately, due to the rapidly declining prices of Solid State Drives, there is a quick and inexpensive solution. The challenge was figuring out how to add another drive to my existing Antec MicroFusion Remote 350 HTPC case, which only has one 3.5″ bay. I didn’t want to replace the existing 2TB HDD, which was good enough for storing media, or lose the Blu-ray drive, so I looked for another place to mount a 2.5″ SSD. The easiest solution was to use one of the spare PCI slots and mount the new SSD to it using a bracket (several options are available on both Amazon and eBay). The drive that I picked out was the Samsung 840 EVO-Series 120GB 2.5″ SSD, which I think is the perfect size for the OS and applications partition.
With the second drive installed, migrating the operating system to it was trivial using the free Partition Master program from EaseUS. The program has an option for OS migration with a straightforward wizard to guide the process (as well as SSD optimization). Just be sure to size the Windows recovery partition yourself (100MB for Windows 7 and 250MB for Windows 8) because Partition Master does not size it correctly. It took about 15 minutes to copy the OS partition from the HDD to the SSD. So, for about $100, this upgrade reduced the HTPC boot time to under 10 seconds while making the box run slightly quieter.
Wireless Speakers (Multi-Room Audio)
The multi-room audio setup that I had relied on a patch to XBMC which allowed simultaneously outputting audio to two devices (a feature Windows XP had natively that was removed in Windows 7 and up). Unfortunately, until recently, this patch was available only for XBMC versions 10 and 11. I looked for alternatives using custom sound drivers, but it just wasn’t the same. Fortunately, a new dual audio patch is now available for XBMC 12.3, so I can finally upgrade to the latest version without losing existing functionality.
After a failed attempt at getting multi-file upload functionality working on an intranet site using the AjaxFileUpload control from the Ajax Control Toolkit, I was forced to look for alternatives. The AjaxFileUpload control was designed for ASP.NET, so I originally thought that it would be the ideal solution. Unfortunately, I kept getting authentication errors in certain usage scenarios on legacy browsers. As an alternative, I ended up using Plupload, which is used in content management systems like WordPress. While not designed specifically for ASP.NET, I found Plupload flexible enough to work well with different kinds of server side technologies as well as most browsers. Plupload does not interfere with ASP.NET’s partial page rendering and can be used on sites with master pages. For legacy browsers, it can be configured to fall back to its Flash interface or an even more basic HTML 4 interface (instead of the HTML 5 interface that I used by default). Here is how I set it up.
I found the multipart_params parameter very useful for passing server side attributes for files being uploaded. I defined some of the parameters (e.g. AllowedFileTypes) on the server side so that I could be consistent later when it comes time to validate the data stream submitted from the client. Also, I found it simpler to implement the FileUploaded event handler on the client side so that I could indicate the status of the upload to the user (otherwise all uploads look successful unless something crashes on the server side). (more…)
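To illustrate the pieces described above, here is a minimal configuration sketch. The handler URL, element ids, and the AllowedFileTypes values are hypothetical placeholders, not taken from the original project:

```javascript
// Minimal Plupload configuration sketch. The url, browse_button id, and
// parameter names below are illustrative assumptions, not from the
// original project.
function buildUploaderConfig(allowedFileTypes, sessionToken) {
  return {
    browse_button: 'pickfiles',        // id of the "Add Files" button
    url: 'upload.ashx',                // hypothetical ASP.NET handler
    runtimes: 'html5,flash,html4',     // fall back to Flash/HTML4 on legacy browsers
    flash_swf_url: 'js/Moxie.swf',
    filters: {
      max_file_size: '10mb',
      mime_types: [{ title: 'Documents', extensions: allowedFileTypes }]
    },
    // Server-defined attributes sent along with each file so the same
    // values can be re-validated against the submitted data stream.
    multipart_params: {
      allowedFileTypes: allowedFileTypes,
      sessionToken: sessionToken
    }
  };
}

// A FileUploaded handler reports per-file status back to the user;
// without it, a failed server-side save would still look successful.
function onFileUploaded(up, file, result) {
  var response = JSON.parse(result.response);
  return file.name + ': ' + (response.success ? 'uploaded' : 'failed');
}
```

In a real page, `buildUploaderConfig` would feed `new plupload.Uploader(...)` and `onFileUploaded` would be bound via `uploader.bind('FileUploaded', ...)`, updating the DOM instead of returning a string.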
I recently wrote about using the Full-Text search feature built into SQL Server to allow users to search through documents (and the challenge of displaying a summary of the search results). Configuring Full-Text search was a fairly easy process; however, populating the table containing the data to be searched turned out to be a bit more tricky. I wanted to avoid the overhead of using SQL Server FileStream and FileTables and needed control over the text that was extracted from documents. My only option was to implement a custom indexer to extract the text that I wanted from files and then store the text in the database.
My first attempt at extracting text from documents was using iFilters. This is the interface that Windows Search uses to index files; SQL Server uses it as well to search through FileTables. I liked the universality of this approach because any file type registered on the server would be parse-able without requiring file-type specific code. After hours of browsing PInvoke.net and looking at working projects online, I finally got a library that compiled and parsed all of the file types that I needed. Unfortunately, I had to give up on this approach because of several limitations. First, the code was just not stable enough for my liking. It required reading the registry, loading COM objects, and involved a great deal of unmanaged code. As a result, the library was hard to set up on various servers due to issues with 32 bit vs 64 bit iFilter libraries. Furthermore, because of the unmanaged code, I could not invoke the library from a web page and ended up implementing a Windows Service which indexed the documents offline. The final straw was that Adobe’s iFilter parser for PDF files sucks: it parses the entire document into one string with no way to discern pages, lines, or sections of the document. It was time to try something else. (more…)
Microsoft SQL Server has a convenient Full-Text search capability which is powerful and fairly easy to set up. There is, however, one glaring omission. SQL Server can rank the search results, but it has no built-in way to summarize them. So when you store large documents in a table, you can easily tell a user which of those documents match a particular search string, but you can’t easily tell the user where to look inside those documents. Imagine using an internet search engine and getting a list of links without a summary of those pages.
I figured that I couldn’t be the first to need this functionality and expected to find plenty of solutions online. One solution that looked promising is the Hit Highlight user defined function. It is easy to set up, but I didn’t like the resulting summary or the performance hit (the full-text search itself runs faster than the summary). Another solution that I found was the ThinkHighlight function, which improves performance a bit by implementing the code in a CLR assembly, but it is not free. This site compared the performance of these two solutions, and it didn’t seem like the performance improvement of the CLR assembly justified the cost. Since I didn’t find the solution that I was looking for, I set out to make one of my own. (more…)
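The core of any hit-highlighting summary is the same: find the first occurrence of a search term in the stored document text and return a window of surrounding context with the term marked up. Here is a minimal sketch of that idea (illustrative only, not the actual implementation behind the fold, which runs against SQL Server):

```javascript
// Sketch of a simple hit-highlighting summary: find the first occurrence
// of any search term and return a window of surrounding text with the
// matched term wrapped in <b> tags. (Illustrative logic only.)
function summarize(text, terms, window) {
  var lower = text.toLowerCase();
  for (var i = 0; i < terms.length; i++) {
    var term = terms[i].toLowerCase();
    var pos = lower.indexOf(term);
    if (pos === -1) continue;          // try the next search term
    var start = Math.max(0, pos - window);
    var end = Math.min(text.length, pos + term.length + window);
    var prefix = start > 0 ? '...' : '';
    var suffix = end < text.length ? '...' : '';
    return prefix + text.slice(start, pos) +
           '<b>' + text.substr(pos, term.length) + '</b>' +
           text.slice(pos + term.length, end) + suffix;
  }
  // No term found: fall back to the beginning of the document.
  return text.slice(0, window) + (text.length > window ? '...' : '');
}
```

A production version would also need to respect word boundaries and handle the inflected forms that the full-text engine matches, which is where most of the real complexity lives.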
I was working on a project where I needed to store connection strings in a database table for use inside of a reporting engine. SQL Server offers many encryption options from encrypting entire databases to hashing individual strings so it took me some time to come up with an ideal solution. Hashing was not an option for me because I needed to be able to read back the connection strings in clear text before passing them to the reporting engine. Encrypting the entire database seemed like overkill and would create an unnecessary performance hit. Ultimately, I was designing a method to protect the data in one field of a table from users who may potentially have read-access to the database. (more…)
ASP.NET can be used to create powerful and efficient web applications; however, there are times when users will experience a lag during the initial connection to the site. Even though subsequent page requests during the user’s session will be much faster, that initial lag creates a perception of sluggishness. The issue is due to ASP.NET dynamically compiling the site and loading it into the cache of the IIS application pool. The actual duration of the lag depends on many factors, including the complexity of the site and the IIS configuration. There are many discussions online about ways to address this perceived problem, including 1) precompiling the site prior to deployment, 2) configuring IIS to hold on to the cache of compiled web applications longer, and 3) using site warmup scripts. In this post, I will describe another option: the use of a site splash screen. (more…)
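The essence of the splash-screen option is a lightweight static page that appears instantly and keeps pinging the real application until it has finished compiling, then redirects. A minimal sketch of that loop, with the ping and navigation steps injected as callbacks so the logic is shown without a browser (all names here are illustrative, not from the post's actual implementation):

```javascript
// Splash-page sketch: keep retrying a lightweight request to the real
// application until it answers, then navigate to it. In a real splash
// page, ping would issue an XMLHttpRequest to a page of the ASP.NET site
// (spaced out with setTimeout) and navigate would assign window.location.
function warmupRedirect(ping, navigate, maxAttempts) {
  for (var attempt = 1; attempt <= maxAttempts; attempt++) {
    if (ping()) {       // true once the app pool has compiled and responded
      navigate();
      return attempt;   // number of pings it took
    }
  }
  return -1;            // gave up; leave the splash screen showing an error
}
```

Because the splash page itself is static HTML, it renders immediately even while the application pool is still compiling the site.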
I love my Roomba but there are many places where it cannot reach. A full-sized vacuum would be overkill for my apartment and would take up too much space. This is why I think the Dyson DC44 Animal is a perfect companion to the Roomba. It is small, powerful, and conveniently hangs (and charges) inside my closet. I can use it as an upright vacuum for spots requiring more suction than the Roomba can muster and as a small hand-held to get under the bed, inside closets and bathrooms, and for the car. It holds its charge well, is lightweight, and is easy to clean. I sometimes find myself walking around the apartment with the DC44 in hand looking for dust bunnies to zap. I don’t think the DC44 could serve as a whole-house vacuum on its own, but, with the Roomba at its side, I think it will suffice. Sure it is pricey, but you pay for the quality and convenience (and you can sometimes find it on Amazon for 25% off retail price). It’s also cheaper than any of the Dyson full size/upright models. Neither the Roomba nor the Dyson DC44 requires purchasing additional accessories such as vacuum bags or lubricants, so I hope the two can live long and productive lives with minimal maintenance costs.
Whenever I am working on a simple ASP.NET web page that allows a user to search and edit a specific database table, I use the following design template to facilitate implementation. I find that it works for most simple web apps that I need. The sample code files are available for download at the end of the article.
• ASP.NET page that allows users to search data from a single table and edit the results (insert is not covered for simplicity but can be easily adapted to the design)
• Client and server side data validation
• .NET Framework 4.0
• Ajax Control Toolkit
Recently, I was working on a project that involved storing parking restriction information in a database. Here’s how I did it.
- Efficiently store parking restriction details (e.g. No Parking Tues & Fri from 9:30-11:00) associated with work locations.
- Ability to search the data by any combination of parking days and times.
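One compact way to satisfy both requirements (an assumption on my part, since the post's actual schema is behind the fold) is a day-of-week bitmask plus start/end times stored as minutes since midnight; searching any combination of days and times then reduces to a bitwise AND and an interval-overlap test:

```javascript
// Sketch of a compact encoding for a restriction like
// "No Parking Tues & Fri from 9:30-11:00": a 7-bit day-of-week mask plus
// start/end times in minutes since midnight. (One plausible design, not
// necessarily the schema from the original post.)
var DAYS = { Sun: 1, Mon: 2, Tue: 4, Wed: 8, Thu: 16, Fri: 32, Sat: 64 };

function makeRestriction(days, startMin, endMin) {
  var mask = 0;
  days.forEach(function (d) { mask |= DAYS[d]; });
  return { dayMask: mask, startMin: startMin, endMin: endMin };
}

// A query matches when it shares at least one day with the restriction
// and the two time windows overlap.
function matches(restriction, queryDays, queryStart, queryEnd) {
  var queryMask = 0;
  queryDays.forEach(function (d) { queryMask |= DAYS[d]; });
  return (restriction.dayMask & queryMask) !== 0 &&
         restriction.startMin < queryEnd &&
         queryStart < restriction.endMin;
}
```

The same mask-and-interval comparison translates directly into an indexed SQL WHERE clause, which is what makes the searches efficient.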