where $[1]$ is an $M \times M$ matrix of ones. Rumor has it that Google uses $\alpha = 0.15$, so use that value and calculate $B$.
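A minimal sketch of this step, assuming the standard PageRank form $B = (1-\alpha) A + \frac{\alpha}{M} [1]$, where ''A'' is the $M \times M$ column-stochastic link matrix built earlier in the lab:

<code matlab>
alpha = 0.15;              % Google's rumored damping parameter
M = size(A,1);             % A is the link matrix from the earlier parts
B = (1-alpha)*A + (alpha/M)*ones(M,M);
</code>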
**(d)** Double-check that the sum of each column of $B$ is 1. Again, be clever and get Matlab to do the work, rather than listing out the sums of all the columns and verifying manually that each of the $M$ numbers is 1!
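One possible one-liner (a sketch, not the only way): ''sum(B)'' returns the column sums as a row vector, so you can measure how far all of them are from 1 at once.

<code matlab>
max(abs(sum(B) - 1))   % should be zero up to roundoff, e.g. ~1e-16
</code>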
**(e)** Let’s assume that all websurfers start at the first webpage, ''x=zeros(M,1); x(1)=1;''. If we iteratively apply ''B'' to ''x'' many times (say 40 times), the resulting vector will give the probability that we end up on a particular website after a long session of random web surfing. We can do this with a for-loop.
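A minimal sketch of such a loop, assuming ''B'' from the previous part and using $N = 40$ steps:

<code matlab>
x = zeros(M,1);  x(1) = 1;   % all surfers start at page 1
N = 40;                      % length of the random-surfing session
for n = 1:N
  x = B*x;                   % one step of random surfing
end
</code>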
**(f)** If you are interested: Our calculation involved two free parameters: the probability $\alpha$ of jumping to an entirely random page in the network, and $N$, the number that specifies the length of our long session of random web surfing. How robust is the page rank algorithm with respect to these two parameters? If you change $\alpha$ to 0.1 or 0.05, do the top 10 pages change?
How about if you change $M$ to 64, 128, 256, 512, or 1024? (Some of the larger values might take a long time to run.)
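A hedged sketch of the $\alpha$ part of this experiment: it keeps ''A'' and $M$ fixed, reuses the $N = 40$ iteration from part (e), and uses ''sort'' to list the 10 highest-ranked pages for each value of $\alpha$.

<code matlab>
for alpha = [0.15 0.10 0.05]
  B = (1-alpha)*A + (alpha/M)*ones(M,M);
  x = zeros(M,1);  x(1) = 1;
  for n = 1:40
    x = B*x;
  end
  [~, idx] = sort(x, 'descend');       % pages sorted by final probability
  fprintf('alpha = %4.2f, top 10: ', alpha);
  fprintf('%d ', idx(1:10));
  fprintf('\n');
end
</code>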
----