# The Last Lecture

Graphs and Networks lecture notes. Daniel A. Spielman. December 4.

## Overview

This lecture will address a few random topics that I'd like to mention before we end the semester.

## Graph Drawing

An amazing theorem of Tutte [Tut63] gives a method of drawing a 3-connected planar graph. Recall that a graph is 3-connected if there is no way of disconnecting it by removing only 1 or 2 vertices. Tutte showed how to construct a planar embedding of any 3-connected planar graph by rubber bands. To begin, fix any face of the graph, and nail down the locations of the vertices on that face. Make sure to nail them down so that the edges form a convex polygon. Now, replace the edges by rubber bands, and let the vertices settle. Each non-nailed vertex will land at the center of gravity of its neighbors. Tutte proved that this is a planar embedding.

Let's do an example. First, I'll generate a random set of points in a box, and then create a planar graph on them by drawing a Delaunay triangulation:

```matlab
[a,xy,tri] = delgraph();   % the argument (if any) was lost in transcription
clf
plot(xy(:,1),xy(:,2),'o');
hold on
gplot(a,xy);
```

We will now pick an arbitrary triangle from this figure, nail down the locations of its vertices to form a regular triangle, and solve for the positions of the remaining vertices:

```matlab
ind = tri(1,:);
m = length(ind);
pos = [cos(2*pi*[1:m]/m); sin(2*pi*[1:m]/m)]';   % nailed positions
la = diag(sum(a)) - a;        % the graph Laplacian
la(ind,:) = 0;                % replace the nailed rows...
la(ind,ind) = eye(m);         % ...with identity rows
b = zeros(length(a),2);
b(ind,:) = pos;
xyt = la \ b;                 % each free vertex lands at the average of its neighbors
gplot(a,xyt);
```

Of course, we would expect a better picture if we used an outside face instead of a triangle from the middle:

```matlab
k = convhull(xy(:,1),xy(:,2));
ind = k(1:end-1);             % convhull lists the first vertex twice
m = length(ind);
la = diag(sum(a)) - a;
la(ind,:) = 0;
la(ind,ind) = eye(m);
b = zeros(length(a),2);
pos = [cos(2*pi*[1:m]/m); sin(2*pi*[1:m]/m)]';
b(ind,:) = pos;
xyt = la \ b;
gplot(a,xyt);
```

I don't want to give you the idea that these pictures are always good. For example, consider the Delaunay graph of triangles within triangles within triangles.
```matlab
m = 3;
pos = [cos(2*pi*[1:m]/m); sin(2*pi*[1:m]/m)]';
xy = [];
for i = 1:5,                  % the loop bound was lost in transcription; 5 is a placeholder
  xy = [xy; pos * i * (-1)^i];   % nested triangles with alternating orientation
end
clf
plot(xy(:,1),xy(:,2),'o')
a = delgraph(xy);
```

When we draw it using Tutte's algorithm, we get triangles that become exponentially smaller:

```matlab
ind = 1:3;
m = length(ind);
la = diag(sum(a)) - a;
la(ind,:) = 0;
la(ind,ind) = eye(m);
pos = [cos(2*pi*[1:m]/m); sin(2*pi*[1:m]/m)]';
b = zeros(length(a),2);
b(ind,:) = pos;
xyt = la \ b;
gplot(a,xyt);
```

## Graphs From Vectors

In the paradigmatic regression and classification problems of Machine Learning, one is given a list of vectors $x_1, \ldots, x_n$ along with the values of some function on these vectors, $y_1, \ldots, y_n$. One is then given a new vector $x$ and asked to estimate the value of $y$. In the semi-supervised case, one is only given the values $y_1, \ldots, y_k$ for a $k$ that is less than $n$.

One of the major approaches to solving this problem is to associate a vertex with each vector, to connect them by a graph, and then to use learning algorithms on the graph. There are many ways of choosing these graphs. Standard approaches either place edges between all pairs of nodes within some given distance, or construct k-nearest-neighbor graphs. One can then weight the edges according to the distance between the vertices they connect. Standard choices of weighting functions are inverse in the distance, inverse in the square of the distance, exponential in the distance, and exponential in the square of the distance. For example,

$$w_{i,j} = e^{-d(x_i, x_j)^2 / \sigma},$$

for some $\sigma$.

I believe that there should be a more systematic approach to doing this. I'll briefly tell you about an approach that Daitch and Kelner and I [DKS09] took to this problem. I don't think that it is the right approach. But, it might provoke the right approach.

We will build the graph just from the vectors $x_1, \ldots, x_n$, ignoring the labels. If we are going to have any chance of solving this problem, the function $y$ will have to be related to the coordinates of the vectors.
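The standard weightings above are easy to state concretely. Here is a minimal sketch of the exponential-in-squared-distance choice, written in Python/NumPy rather than the MATLAB used elsewhere in these notes; the function name and the sample points are illustrative, not from the lecture.

```python
import numpy as np

def gaussian_weights(X, sigma=1.0):
    """Weight each pair of points by exp(-d(x_i, x_j)^2 / sigma).

    X is an (n, d) array of data points. Returns an (n, n) symmetric
    weight matrix with zero diagonal (no self-loops).
    """
    # Pairwise squared Euclidean distances via broadcasting.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-sq / sigma)
    np.fill_diagonal(W, 0.0)
    return W

# Three points in the plane: distances 1 and 2 from the origin point.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
W = gaussian_weights(X, sigma=1.0)
# W[0,1] = exp(-1), W[0,2] = exp(-4).
```

In practice one would also sparsify this matrix, e.g. by keeping only edges within a distance threshold or the k nearest neighbors, as described above.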
So, we will try to build a graph that will help us solve for the coordinates of the vectors. To make this interesting, we will assume that we know the coordinates of every vector but one, and see how well we estimate that one. Our guess for the coordinates of a vertex $i$ will be the weighted average of the coordinates of its neighbors:

$$\hat{x}_i = \frac{1}{\sum_j w_{i,j}} \sum_j w_{i,j} x_j.$$

We could then measure the error this provides:

$$\left\| x_i - \frac{1}{\sum_j w_{i,j}} \sum_j w_{i,j} x_j \right\|.$$

Our initial idea was to choose the graph that minimizes this over all choices for the vertex we leave out:

$$\sum_i \left\| x_i - \frac{1}{\sum_j w_{i,j}} \sum_j w_{i,j} x_j \right\|^2.$$

It turns out that this problem is degenerate in many ways. But, it becomes reasonable if we weight the contribution of each vertex by its weighted degree:

$$\sum_i \left\| \Big(\sum_j w_{i,j}\Big) x_i - \sum_j w_{i,j} x_j \right\|^2.$$

However, we now get zero if all edge weights are zero. So, we force a non-trivial solution by requiring every vertex to have weighted degree at least 1. The problem of computing the graph now becomes a semi-definite program, which we can solve.

The resulting graph has some interesting properties. On many standard Machine Learning problems it gives better solutions than the other graphs. It also has interesting combinatorial properties. For example, for data lying in $d$ dimensions it has average degree at most $d + 1$. For data lying in the plane, the graph is planar. That's pretty surprising.

Let's see an example. I'll choose random points in the box, compute our graph, and then draw it:

```matlab
x = rand(500,2);   % the number of points was lost in transcription; 500 is a placeholder
plot(x(:,1),x(:,2),'o');
a = deggraph(x);
gplot(a,x)
```

## Respondent Driven Sampling

We will discuss the problem of Respondent Driven Sampling. It is an important problem to solve. But, the mathematical hurdles might be insurmountable. Either way, the claims of statistical validity reported in the literature are clearly unjustifiable.

## Algebraically Defined Graphs

I will say a few words about the types of graphs we define mathematically.

## References

[DKS09] Samuel I. Daitch, Jonathan A. Kelner, and Daniel A. Spielman.
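The degree-weighted objective is easy to evaluate for a fixed graph, even though finding the minimizing weights requires the optimization described above. Here is a minimal NumPy sketch that only evaluates the objective (it does not solve for the weights); the helper name and the toy data are illustrative assumptions:

```python
import numpy as np

def fit_objective(W, X):
    """Degree-weighted leave-one-out objective from the text:
    sum_i || (sum_j w_ij) x_i - sum_j w_ij x_j ||^2.

    W: (n, n) symmetric nonnegative weight matrix; X: (n, d) data points.
    """
    d = W.sum(axis=1)                   # weighted degrees
    resid = d[:, None] * X - W @ X      # row i: d_i * x_i - sum_j w_ij x_j
    return float((resid ** 2).sum())

# Three collinear, evenly spaced points on a path graph 0-1-2 with unit
# weights: the middle point is exactly the average of its neighbors, so
# only the two endpoints contribute error.
X = np.array([[0.0], [1.0], [2.0]])
W = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
val = fit_objective(W, X)
# Row 0: 1*0 - 1 = -1; row 1: 2*1 - (0+2) = 0; row 2: 1*2 - 1 = 1.
# Objective = 1 + 0 + 1 = 2.
```

Scaling all weights toward zero drives this objective to zero, which is why the degree constraint above is needed to rule out the trivial solution.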
Fitting a graph to vector data. In Proceedings of the 26th Annual International Conference on Machine Learning, ICML '09, New York, NY, USA, 2009. ACM.

[Tut63] W. T. Tutte. How to draw a graph. Proceedings of the London Mathematical Society, 13:743-768, 1963.