Now, from the test PC, open YouTube and play any video. After it downloads completely, delete the browser cache and play the same video again; this time it will be served from the cache. You can verify this by monitoring your WAN link utilization while the cached file plays.
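One quick way to eyeball WAN utilization during the replay is to sample the interface's RX byte counter from /proc/net/dev. A rough sketch; WAN_IF is an assumption, so set it to your actual WAN-facing interface:

```shell
# Sample cumulative inbound bytes on the WAN interface one second apart.
# WAN_IF is an assumption -- replace with your real WAN interface name.
WAN_IF="${WAN_IF:-eth0}"
read_rx() {  # cumulative RX bytes for $WAN_IF from /proc/net/dev
  awk -v ifc="$WAN_IF" '{sub(/:/," ")} $1 == ifc {print $2}' /proc/net/dev
}
rx1=$(read_rx); sleep 1; rx2=$(read_rx)
echo "Inbound on $WAN_IF: $(( (${rx2:-0} - ${rx1:-0}) / 1024 )) KB/s"
```

If the video is really coming from the cache, this figure should stay near zero while it plays.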
Dear Janzaib, I have to inform you that I have a storeurl.pl pattern to cache YouTube, but it only caches 51 seconds of each video segment. However, after adding (quick_abort_min 1 MB) and (range_offset_limit 10 MB), when I download any YouTube video once with IDM and re-download it from another PC with IDM, it serves the full video from cache. So I think the storeurl.pl script is missing something needed to cache the full video for the Flash player. If I am wrong, please correct me. Thanks.
I would not recommend IDM caching. I am sure you are using the range_offset_limit directive in Squid; if so, stop using it, as it can have a negative impact in the form of unnecessary WAN link utilization. Look at -dont-cache-idm-downloads/
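For reference, the directive in question lives in squid.conf, and for most setups the safe choice is its default. A sketch (the value shown is the stock default, not something special to this setup):

```
# squid.conf -- leaving range_offset_limit at its default of 0 forwards
# byte-range requests to the origin unchanged, so multi-segment download
# managers like IDM cannot trick Squid into fetching the whole object
# over the WAN once per segment.
range_offset_limit 0 KB
```

Raising this limit (or setting it to none) makes Squid fetch entire objects for range requests, which is exactly the extra WAN usage being warned about here.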
I have not faced such an issue. I implemented the nginx-based YouTube caching method on a friend's network, and for a long time now it has been working great, caching YouTube videos near perfectly. It also stops downloading the video if the user stops or aborts playback. Since YouTube changed its delivery method to chunks (about 1.7 MB each), things have become easier and more manageable for cache admins. There must be some mistake in your Squid configuration.
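The core of that nginx-based method is nginx's proxy_store: serve a chunk from disk if it is already cached, otherwise fetch it upstream and keep a copy. A heavily simplified sketch of the idea; the paths, match rule, and cache key here are assumptions for illustration, not the full working config:

```
# nginx.conf fragment -- illustrative only
location ~ ^/videoplayback {
    root      /var/cache/nginx/youtube;     # serve from disk on a cache hit
    try_files /$arg_id @fetch;              # fall through to upstream on a miss
}
location @fetch {
    proxy_pass         http://$host$request_uri;           # fetch the chunk
    proxy_store        /var/cache/nginx/youtube/$arg_id;   # and keep a copy
    proxy_store_access user:rw group:rw all:r;
}
```

Because each chunk is a small standalone file, an aborted playback only wastes the chunk in flight, which is why chunked delivery suits this approach.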
1) Nginx files donot pass through squid , it goes throught nginx, thats why its not marked as cache HIT thus it cannot be bypassed by queue limit, possibly there could be workaround but I am not aware of it yet, as Youtube is still banned in pakistan therefore cant test it.2) You cant abort an video download even if its not viewed fully by client, Its nginx related issue, no workaround for it yet.3) ubuntu 12 have squid 3.x by default, you cant install squid 2.7 with atp-get, however you can download squid 2.7 source source and compile it, this way it will work.4) NGINX is the better way.5) Yes you can route specific port/destination webs request to specific wan. Use Mark and Route method.
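The mark-and-route method from point 5 is usually an iptables fwmark plus a policy-routing rule. A configuration sketch under assumed values (eth2 as the second WAN with gateway 192.168.2.1, routing table 102, and port 80 as the traffic to steer):

```
# Mark outbound port-80 traffic, then send marked packets via the second WAN.
iptables -t mangle -A PREROUTING -p tcp --dport 80 -j MARK --set-mark 2
ip rule add fwmark 2 table 102
ip route add default via 192.168.2.1 dev eth2 table 102
```

The same pattern works for a destination website: match on -d <destination-ip> instead of --dport.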
# After configuring cache_mem, did you restart the Squid service? How many active users access the proxy box?
# 32 GB of RAM will give a good performance boost by letting Squid use more and more RAM for cache. Make sure you use a 64-bit Linux OS, whether it's Ubuntu or another flavor of *nix.
# Yes, you can use two hard disks: one for the default Squid cache and a second for YouTube.
# It will take a long time to fill up 2 TB of space with YouTube cache alone, unless you have a large number of users. Anyhow, when the drive does fill up, nginx will not clear it automatically. You can create a simple bash script that deletes files older than X days, or any file that has not been accessed in the past two months, and schedule it to run daily or weekly. Read the following; it was originally written for Windows, but using the same logic you can create your own bash script. In that guide, look for the section on deleting files older than X days using FORFILES: -file-to-delete-files-older-then-n-days/
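Such a cleanup script can be very short. A minimal sketch; the cache path is an assumption, so point CACHE_DIR at your real nginx YouTube cache directory:

```shell
#!/bin/bash
# clean_youtube_cache.sh -- delete cached files not accessed in ~2 months.
# CACHE_DIR is an assumption; set it to your actual nginx cache directory.
CACHE_DIR="${CACHE_DIR:-/var/cache/nginx/youtube}"
# -atime +60 matches files whose last access was more than 60 days ago.
find "$CACHE_DIR" -type f -atime +60 -delete 2>/dev/null || true
```

Schedule it from cron to run nightly, e.g. `0 3 * * * /usr/local/bin/clean_youtube_cache.sh`. Note that -atime only works as intended if the cache filesystem is not mounted with noatime; with noatime, switch to -mtime instead.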
Sayed, please help. Nginx keeps downloading even after I stop serving videos; I mean, even if I put a 100m limit, it still takes them. I have two Ethernet links, one fake and one real. Even after I removed the cable from the fake interface on the cache box, it still takes bandwidth from the real interface. Why? Thank you.
VideoCache: when a user watches any video, Squid downloads it, and at the same time the video is downloaded in parallel by the VC plugin (a Python script plus Apache). So it consumes double the bandwidth. For example, if 10 users are watching 10 different videos, the bandwidth of 20 video downloads is consumed. In this case, VC sucks.
The students have found that they get noticeably faster web access when they are configured to use the cache proxy. This was intentional: their networking people really want to prioritize interactive web traffic over bulk downloads, and giving the cache proxy high priority accomplishes exactly that.
Squid requires a full copy of an object to already exist in its storage before a partial (range) request can be satisfied quickly from the cache. Suppose a user is watching a video stream and browses to a different page before the video downloads completely. In that case, Squid cannot keep the partial download for reuse and simply discards the data. Special configuration is required to force such downloads to run to completion and be cached.
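The knobs involved in that special configuration are Squid's quick_abort_* and range_offset_limit directives. A hedged sketch of the aggressive end of the spectrum (the values are illustrative, and syntax varies slightly between Squid 2.x and 3.x):

```
# squid.conf -- finish aborted downloads so the full object lands in cache.
# quick_abort_min -1 KB means never abandon a transfer a client walked
# away from; range_offset_limit none makes Squid fetch the whole object
# even when the client asked for only a byte range.
quick_abort_min -1 KB
range_offset_limit none
```

Both settings trade extra WAN bandwidth now for cache hits later, so they are best applied selectively (for example, via ACLs restricted to video domains) rather than globally.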
Apt-cacher is a caching proxy for Debian packages, allowing a number of computers to share a single local cache. Packages requested through the cache only need to be downloaded from the Debian mirrors once, no matter how many local machines need to install them. This saves internet bandwidth, improves performance for local users, and reduces the load on the mirrors.
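Pointing a client machine at the cache takes one line of APT configuration. A sketch, assuming the apt-cacher box sits at 192.168.1.10 (that address is an assumption) on apt-cacher's default port 3142:

```
# /etc/apt/apt.conf.d/01proxy -- route APT's HTTP fetches through apt-cacher
Acquire::http::Proxy "http://192.168.1.10:3142";
```

After adding this file, a normal `apt-get update && apt-get install <pkg>` on each client transparently pulls packages through the shared cache.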