After a failed attempt at adding multi-file upload functionality to an intranet site using the AjaxFileUpload control from the Ajax Control Toolkit, I was forced to look for alternatives. The AjaxFileUpload control was designed for ASP.NET, so I originally thought it would be the ideal solution. Unfortunately, I kept getting authentication errors in certain usage scenarios on legacy browsers. As an alternative, I ended up using Plupload, which is used in content management systems like WordPress. While not designed specifically for ASP.NET, I found Plupload flexible enough to work well with different kinds of server-side technologies as well as most browsers. Plupload does not interfere with ASP.NET’s partial page rendering and can be used on sites with master pages. For legacy browsers, it can be configured to fall back to its Flash interface or an even more basic HTML 4 interface (instead of the HTML 5 interface I used by default). Here is how I set it up.
I found the multipart_params parameter very useful for passing server-side attributes along with the files being uploaded. I defined some of these parameters (e.g. AllowedFileTypes) on the server side so that I could stay consistent later when validating the data stream submitted from the client. I also found it worthwhile to implement the FileUploaded event handler on the client side to indicate the status of each upload to the user (otherwise, all uploads look successful unless something crashes on the server side). (more…)
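A minimal Plupload initialization along these lines might look like the following sketch. The element IDs, the `upload.ashx` handler URL, and the parameter values here are placeholders rather than the project’s actual values:

```javascript
// Illustrative Plupload setup: HTML 5 by default, with Flash and HTML 4
// fallbacks for legacy browsers. IDs and URLs below are placeholders.
var uploader = new plupload.Uploader({
    runtimes: 'html5,flash,html4',      // try HTML 5 first, then fall back
    browse_button: 'btnBrowse',
    container: 'uploadContainer',
    url: 'upload.ashx',                 // server-side handler (placeholder)
    flash_swf_url: 'js/Moxie.swf',
    multipart_params: {
        // server-defined values echoed back so the handler can
        // validate the incoming stream consistently
        AllowedFileTypes: '.pdf,.docx,.jpg'
    }
});

uploader.bind('FileUploaded', function (up, file, result) {
    // Surface the server's verdict instead of assuming success.
    var status = (result.status === 200) ? 'Uploaded' : 'Failed';
    document.getElementById('status_' + file.id).innerHTML = status;
});

uploader.init();
```

Because the fallback runtimes are listed in `runtimes`, the same configuration serves both modern and legacy browsers.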
I recently wrote about using the Full-Text Search feature built into SQL Server to allow users to search through documents (and the challenge of displaying a summary of the search results). Configuring Full-Text Search was a fairly easy process; however, populating the table containing the data to be searched turned out to be a bit trickier. I wanted to avoid the overhead of using SQL Server FILESTREAM and FileTables and needed control over the text that was extracted from the documents. My only option was to implement a custom indexer to extract the text that I wanted from files and then store that text in the database.
My first attempt at extracting text from documents used IFilters. This is the interface that Windows Search uses to index files; SQL Server uses it as well to search through FileTables. I liked the universality of this approach because any file type registered on the server would be parseable without requiring file-type-specific code. After hours of browsing PInvoke.net and looking at working projects online, I finally had a library that compiled and parsed all of the file types that I needed. Unfortunately, I had to give up this approach because of several limitations. First, the code was just not stable enough for my liking. It required reading the registry, loading COM objects, and involved a great deal of unmanaged code. As a result, the library was hard to set up on various servers due to issues with 32-bit vs. 64-bit IFilter libraries. Furthermore, because of the unmanaged code, I could not invoke the library from a web page and ended up implementing a Windows Service which indexed the documents offline. The final straw was that Adobe’s IFilter parser for PDF files sucks: it would parse the entire document into one string with no way to discern pages, lines, or sections of the document. It was time to try something else. (more…)
Microsoft SQL Server has a convenient Full-Text Search capability which is powerful and fairly easy to set up. There is, however, one glaring omission. SQL Server can rank the search results, but it has no built-in way to summarize them. So when you store large documents in a table, you can easily tell a user which of those documents match a particular search string, but you can’t easily tell the user where to look inside those documents. Imagine using an internet search engine and getting a list of links without a summary of those pages.
I figured that I couldn’t be the first to need this functionality and expected to find plenty of solutions online. One solution that looked promising is the Hit Highlight user-defined function. It is easy to set up, but I didn’t like the resulting summary or the performance hit (the full-text search itself runs faster than the summary). Another solution that I found was the ThinkHighlight function, which improves performance a bit by implementing the code in a CLR assembly, but it is not free. This site compared the performance of the two solutions, and it didn’t seem like the performance improvement of the CLR assembly justified the cost. Since I didn’t find the solution that I was looking for, I set out to make one of my own. (more…)
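To make the goal concrete, the core of any such summarizer is locating a hit in the document text and returning a window of surrounding context with the match highlighted. The sketch below is a simplified illustration of that idea, not the actual implementation from the post:

```javascript
// Return a short context window around the first occurrence of `term`
// in `text`, with the hit wrapped in <b> tags. `window` is the number
// of characters of context to keep on each side of the hit.
function summarize(text, term, window) {
    window = window || 40;
    var i = text.toLowerCase().indexOf(term.toLowerCase());
    if (i === -1) {
        // no hit: fall back to the beginning of the document
        return text.slice(0, window * 2) + '...';
    }
    var start = Math.max(0, i - window);
    var end = Math.min(text.length, i + term.length + window);
    var hit = text.slice(i, i + term.length); // preserve original casing
    return (start > 0 ? '...' : '') +
           text.slice(start, i) +
           '<b>' + hit + '</b>' +
           text.slice(i + term.length, end) +
           (end < text.length ? '...' : '');
}
```

A production version would also need to handle multiple hits, word boundaries, and HTML-encoding of the document text, which is where most of the complexity (and the performance cost) comes from.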
I was working on a project where I needed to store connection strings in a database table for use inside of a reporting engine. SQL Server offers many encryption options, from encrypting entire databases to hashing individual strings, so it took me some time to come up with an ideal solution. Hashing was not an option for me because I needed to be able to read back the connection strings in clear text before passing them to the reporting engine. Encrypting the entire database seemed like overkill and would create an unnecessary performance hit. Ultimately, I was designing a method to protect the data in one field of a table from users who may potentially have read access to the database. (more…)
ASP.NET can be used to create powerful and efficient web applications; however, there are times when users will experience a lag during the initial connection to the site. Even though subsequent page requests during the user’s session will be much faster, that initial lag creates a perception of sluggishness. The issue is due to ASP.NET dynamically compiling the site and loading it into the cache of the IIS application pool. The actual duration of the lag depends on many factors, including the complexity of the site and the IIS configuration. There are many discussions online about ways to address this perceived problem, including 1) precompiling the site prior to deployment, 2) configuring IIS to hold on to the cache of compiled web applications longer, and 3) using site warm-up scripts. In this post, I will describe another option: the use of a site splash screen. (more…)
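The splash-screen idea can be sketched roughly as follows: a small static page loads instantly, fires a background request at the ASP.NET application (which triggers the dynamic compilation), and redirects once a response comes back. The function below is an illustrative sketch; `Default.aspx` is a placeholder target, not necessarily the post’s actual entry page:

```javascript
// Runs on a static splash page. The background request forces ASP.NET
// to compile and cache the application; the user sees the splash screen
// instead of a blank, seemingly hung page while that happens.
function warmUpAndRedirect(target) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', target, true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {
            // The app is now compiled and cached, so this navigation
            // will be fast.
            window.location.href = target;
        }
    };
    xhr.send();
}
```

On the splash page itself, this would be invoked once on load, e.g. `warmUpAndRedirect('Default.aspx');`.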
Whenever I am working on a simple ASP.NET web page that allows a user to search and edit a specific database table, I use the following design template to facilitate implementation. I find that it works for most simple web apps that I need. The sample code files are available for download at the end of the article.
• ASP.NET page that allows users to search data from a single table and edit the results (insert is not covered for simplicity but can be easily adapted to the design)
• Client and server side data validation
• .NET Framework 4.0
• Ajax Control Toolkit
Recently, I was working on a project that involved storing parking restriction information in a database. Here’s how I did it.
- Efficiently store parking restriction details (e.g. No Parking Tues & Fri from 9:30-11:00) associated with work locations.
- Ability to search the data by any combination of parking days and times.
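One common way to meet both requirements (shown here as an illustrative sketch, not necessarily the schema from the post) is a day-of-week bitmask plus start/end times stored as minutes since midnight. Matching any combination of days and times then reduces to a bitwise AND on the day masks and an interval-overlap test on the times, both of which translate directly into an indexed SQL WHERE clause:

```javascript
// One bit per weekday; a restriction covering several days is the OR
// of those bits, stored as a single small integer.
var DAYS = { Sun: 1, Mon: 2, Tue: 4, Wed: 8, Thu: 16, Fri: 32, Sat: 64 };

// e.g. "No Parking Tues & Fri from 9:30-11:00"
var restriction = {
    dayMask: DAYS.Tue | DAYS.Fri,
    start: 9 * 60 + 30,   // 9:30 as minutes since midnight
    end: 11 * 60          // 11:00
};

// Does the restriction apply during any of the queried days/times?
function matches(r, queryMask, qStart, qEnd) {
    return (r.dayMask & queryMask) !== 0 &&   // at least one day in common
           r.start < qEnd && qStart < r.end;  // time windows overlap
}
```

In SQL the same test is simply `(DayMask & @queryMask) <> 0 AND StartMin < @qEnd AND @qStart < EndMin`, so any combination of days and times can be searched without exploding the schedule into one row per day.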
One of the goals that I had for my media center was to have the ability to directly play music and movies from my Linux Server. The PS3 provided half of this functionality by supporting wireless streaming and connections to UPnP A/V servers. MediaTomb filled the gap by enabling my Linux Server to stream my entire media library. Below are the details of my configuration and solutions to some of the issues that I encountered. (more…)
When I wrote my earlier article on Managing Users in a PHP Web Application, I neglected to mention that the authentication mechanism is only acceptable when users are connected over a secure connection (HTTPS) or are on a trusted network (such as a corporate intranet). We went to great lengths to ensure that the passwords are stored securely in the database and that the site is not susceptible to SQL injection or XSS attacks. However, when the login form is submitted over an unsecured internet connection, the password is sent to the server in plain text. Anyone lurking on the network can easily capture the login credentials using a network sniffer such as Wireshark. One mitigation is to hash the password using MD5 on the client side prior to submitting the login page; this protects the user’s original password from disclosure, although the hash itself still becomes the credential that travels over the wire. This is similar to how we hashed the password stored in the database to prevent people with access to the table from viewing users’ passwords.
<input onclick="document.form.txtPW.value=MD5(document.form.txtPW.value)" name="Login" type="submit" value="Login" />
Recently, I was forced to relocate my Linux server, so I decided to try out 1&1’s Shared Web Hosting package. This option was a lot cheaper than paying colocation fees at a server farm and provided a solution that is a bit easier to maintain. The challenge was setting up the environment to have the same functionality that I used to have on my LAMP server within 1&1’s restricted environment. I’ll describe some of the challenges and solutions below. This is a follow-up to an earlier guide that I wrote on Configuring a 1&1 Shared Host. (more…)