Pages

Wednesday, January 23, 2013

Tweevio - What and how?

I really enjoy working with the Twitter API. Last year, while I was trying to learn Python, I wrote a script that fetches the trending topics on Twitter. (You can see the code here.)

Now I'm planning to do a project using the Twitter API again. Actually, I've been thinking about it for a long time, but only now do I have a chance to make it happen. I've even given it a name: Tweevio.

Tweevio will show the latest YouTube videos shared on Twitter. Yes, it's that simple.

After this convincing explanation of the project, let's see how I'm planning to build it:

I think I have to use the YouTube API for playing videos. I'm also planning to use the jQuery library. I don't actually know jQuery yet, but Codecademy has really good tutorials, and I will follow them to get an idea. I chose jQuery to make my job easier when sending HTTP requests and parsing JSON.

Before starting this project, I broke the work down into simple tasks. Here they are:
  • Send an HTTP request to the Twitter Search API and store the response in a variable. (Tweets will be searched for youtube.com and youtu.be links.)
  • Parse the JSON that comes back as the response and extract the YouTube links from it (maybe using regular expressions).
  • Write a simple website that has just an input box and a button. This website will show the YouTube video whose URL is entered in the input box.
  • Prepare a very simple website with Twitter Bootstrap. (Use placeholder images.)
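Although the project itself will use jQuery, the link-extraction step in the second task can be sketched in shell. The sample tweets and the regular expression below are my own assumptions; in the real project the input would come from the Twitter Search API response:

```shell
# Pretend these are tweet texts pulled out of the API response.
cat > tweets.txt <<'EOF'
Check this out http://www.youtube.com/watch?v=dQw4w9WgXcQ so good
nothing to see here
short link http://youtu.be/abc123XYZ
EOF

# Match both the youtube.com/watch and the youtu.be link forms.
grep -oE 'https?://(www\.)?(youtube\.com/watch\?v=|youtu\.be/)[A-Za-z0-9_-]+' tweets.txt
# → http://www.youtube.com/watch?v=dQw4w9WgXcQ
# → http://youtu.be/abc123XYZ
```

The same regular expression would carry over to JavaScript almost unchanged, which is why I want to settle it early.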

This is the design of the site, created with Balsamiq Mockups.

Tuesday, January 22, 2013

Plans for semester break

  • I want to follow the Scala course on Coursera given by Martin Odersky. I have no functional programming experience, and I want to get into this area with this course.
  • I will look into the Play! framework, which is used to build web applications with Scala and Java.
  • I will also follow the jQuery and API lessons on Codecademy. To pass a course on Codecademy, you have to write code and complete the missing parts. I think it's a great way to learn.
  • I have a small project that uses the Twitter API. I want to complete it, and I will also write a blog post about it.
Accomplishing all of the above may be difficult, because I have bad experiences with these kinds of to-do lists: I get bored and abandon the things I'm working on. Related to this, a few days ago I saw a video on Facebook. I think the guy in the video has a point, and I will pay attention to the things he emphasizes. I think you should check it out, too.


Friday, January 18, 2013

Download Manager Project - Bash scripting

The download manager is a project I made for the UNIX Scripts and Utilities course. I used bash scripting and dialog; dialog provides the GUI of the project.

The reason I chose this project is that I sometimes download lecture slides for courses I'm interested in. For example, since MIT opened up most of their courses, I want to download the lecture files to see how they run their courses and what the homework is like. Downloading all the files by clicking every link was exhausting. This is where my project lends a hand to make things easier.

Actually, you can download files from a website using the wget command. To download specific file types, such as PDF files, you can give the file extension with the -A parameter. Let's see an example:

       wget -r -A.pdf http://www.cs.ozan.edu/~yildiz/prog101/

This command will download all the PDF files on the given website, which is supposed to contain lecture slides or homework.

Since the aim of the project is to use grep, sed, awk, and cut, and to provide a GUI to the user, I had to come up with something else. And here it is:

How does the script work?
  • After the user enters comma-separated URLs of web pages, the script builds a list of URLs using the awk command.
  • The script loops over those URLs.
  • The source of each web page is downloaded with wget -k -O. The -k parameter converts all relative paths to absolute paths.
  • Then grep extracts these absolute paths by matching file extensions and writes them into a file. (The extensions of the files to be downloaded are predefined.)
  • Another loop downloads the files one by one, and the download bar advances as each file finishes.
  • The script also keeps a history of the downloaded files.
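The steps above can be sketched roughly like this. This is a simplified version without the dialog GUI or the progress bar, and the variable names and sample URLs are my own, not the ones from the actual script:

```shell
#!/bin/bash
# Extensions of the files to be downloaded are predefined.
extensions="pdf|ppt|zip"

# Split the comma-separated URL input into a list with awk.
input="http://example.com/course1,http://example.com/course2"
urls=$(echo "$input" | awk -F',' '{ for (i = 1; i <= NF; i++) print $i }')

# Loop over the URLs and fetch each page source.
for url in $urls; do
    # -k rewrites relative links as absolute; -O names the output file.
    wget -q -k -O page.html "$url"

    # Extract absolute links that end with one of the wanted extensions.
    grep -oE "https?://[^\"' ]+\.($extensions)" page.html >> links.txt
done

# Download the collected files one by one and keep a history.
while read -r link; do
    wget -q "$link" && echo "$link" >> history.txt
done < links.txt
```

In the real script, the dialog gauge widget wraps the last loop so the user sees the download bar move.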
And of course, some screenshots of the project:

This is the part where the user enters links.

The downloading screen. We can see the file that is being downloaded.

I put the code in a gist. Click here to see it. You can add new file extensions to be downloaded by modifying the extensions variable.