This page belongs to EPFL's web archive and is no longer updated.


Dec 18th, Yann
"Semidefinite anything"   (title ack.: http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=00319605 ), in 2 parts:
1 - ... Estimation bounds, again  (w/ Martin and Thierry Blu)
For most non-linear estimation problems (such as localization), lower bounds on the MSE that assume an unbiased estimator are only valid asymptotically.
I will show how an always-valid and, in some sense, optimal lower bound can be obtained in two steps:
- Assert a conic constraint that any feasible estimator (not necessarily unbiased) must satisfy
- Use this constraint within a convex optimization problem (a semidefinite program) to obtain an MSE lower bound
2 - Maximally compact sequences are maximally painful  (w/ Reza and Martin)
I will first give the mathematical intuition behind the periodic definition of frequency spread.
Reza outlined in his TAM how to numerically design maximally compact sequences, i.e., sequences having minimal time spread for a fixed frequency spread.
The semidefinite program used to compute maximally compact sequences can be used as a starting point to obtain an analytical formula.
We obtain that maximally compact sequences are based on Mathieu's cosine of order 0, which, I believe, is named after Mireille Mathieu for its uncanny resemblance to the shape of her haircut ( http://www.youtube.com/watch?v=IrznmH3XZQ8 ).
Unfortunately, Mathieu functions are not so nice to compute...
(props to Ivan for having the right intuition using a different technique)
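As a rough illustration of why a periodic definition is needed: the frequency of a sequence lives on the unit circle, so its spread is naturally measured with circular statistics. The sketch below uses the standard circular-variance convention as a stand-in; the talk's exact definition may differ.

```python
import numpy as np

def time_spread(x):
    """Ordinary (linear) variance of the normalized power |x[n]|^2 over n."""
    p = np.abs(x) ** 2
    p /= p.sum()
    n = np.arange(len(x))
    mean = (n * p).sum()
    return ((n - mean) ** 2 * p).sum()

def periodic_frequency_spread(x):
    """Circular variance of the DFT power spectrum.

    Frequency is a periodic variable, so the spread is measured as
    1 - |E[e^{j*omega}]| (circular-statistics convention) -- an assumed
    stand-in for the talk's periodic definition.
    """
    X = np.fft.fft(x)
    p = np.abs(X) ** 2
    p /= p.sum()
    omega = 2 * np.pi * np.fft.fftfreq(len(x))
    return 1.0 - np.abs((p * np.exp(1j * omega)).sum())

# A single impulse: zero time spread, maximal circular frequency spread.
pulse = np.zeros(64)
pulse[0] = 1.0

# A smooth, wide Gaussian: large time spread, tiny frequency spread.
n = np.arange(64)
gauss = np.exp(-((n - 32) ** 2) / (2 * 8.0 ** 2))
```

The two extremes behave as expected: the impulse concentrates all energy at one sample but spreads its spectrum uniformly over the circle, while the Gaussian does the opposite.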
Posted by Runwei Zhang at 12:53
Dec 10th, Marta
'The Fukushima Inverse problem'
Knowing what amount of radioactive material was released from Fukushima in March 2011, and at what time instants, is crucial to assess the risk and the pollution, and to understand the scope of the consequences. Moreover, it could be used in forward simulations to obtain accurate maps of deposition. But these data are often not publicly available. We propose to estimate the emission waveforms by solving an inverse problem. Previous approaches have relied on a detailed expert guess of how the releases appeared, and they produce a solution strongly biased by this guess. If we plant a nonexistent peak in the guess, the solution also exhibits a nonexistent peak. We propose a method that solves the Fukushima inverse problem blindly. Using atmospheric dispersion models and worldwide radioactivity measurements together with sparse regularization, the method correctly reconstructs the times of major events during the accident, and gives plausible estimates of the released quantities of xenon.
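As a toy illustration of the sparse-regularization step, here is a minimal 1-D deconvolution solved with ISTA (proximal gradient descent on a LASSO objective). The transport operator, sizes, and regularization weight are all invented for the sketch; the actual work uses atmospheric dispersion models and real measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "transport" operator: convolution with a decaying exponential.
n = 100
t = np.arange(n)
h = np.exp(-t / 5.0)
h /= h.sum()
A = np.array([[h[i - j] if i >= j else 0.0 for j in range(n)] for i in range(n)])

# Ground-truth emission waveform: a few sparse release events.
x_true = np.zeros(n)
x_true[[20, 45, 70]] = [3.0, 5.0, 2.0]
y = A @ x_true + 0.001 * rng.standard_normal(n)   # noisy measurements

# ISTA on  0.5*||Ax - y||^2 + lam*||x||_1.
lam = 0.01
L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(2000):
    z = x - A.T @ (A @ x - y) / L            # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
```

The L1 term drives the estimate toward a few sharp events, so the recovered waveform locates the release times without any expert guess of their shape.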


Posted by Runwei Zhang at 12:53
Dec 4th, Reza
"Maximally Compact Sequences"
And here is a short abstract:
For a given time or frequency spread, one can always find continuous-time signals that achieve the Heisenberg uncertainty principle bound. This, however, is known not to be the case for discrete-time sequences; only widely spread sequences asymptotically achieve this bound. We provide a constructive method for designing sequences that are maximally compact in time for a given frequency spread. By formulating the problem as a semidefinite program, we show that maximally compact sequences do not achieve the classic Heisenberg bound. We further provide analytic lower bounds on the time-frequency spread of such signals.
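A quick numerical sanity check of the continuous-time baseline: a well-sampled, wide Gaussian nearly attains the Heisenberg equality sigma_t * sigma_omega = 1/2. This uses plain (non-periodic) variances, and the parameter values are illustrative only.

```python
import numpy as np

N = 256
n = np.arange(N) - N // 2
a = 8.0
x = np.exp(-n**2 / (2 * a**2))          # sampled Gaussian amplitude

# Time spread: variance of the normalized power |x[n]|^2 (mean 0 by symmetry).
pt = x**2 / (x**2).sum()
var_t = (n**2 * pt).sum()

# Frequency spread: variance of the DFT power over omega in [-pi, pi).
X = np.fft.fft(x)
pw = np.abs(X)**2 / (np.abs(X)**2).sum()
omega = 2 * np.pi * np.fft.fftfreq(N)
var_w = (omega**2 * pw).sum()

product = np.sqrt(var_t * var_w)        # continuous-time bound: >= 1/2
```

Because this Gaussian is wide relative to the sampling grid, the product lands essentially on the continuous bound of 1/2; the interesting regime in the talk is narrow sequences, where the bound is provably not achieved.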


Posted by Runwei Zhang at 12:52
Nov 27th, Farid
Private Function Computation on Social Networks
We study the problem of the privacy-preserving computation of functions of data that belong to users in a social network. The objective is to compute a global function of the private information of all the users, while simultaneously guaranteeing some amount of privacy to the users' data. A simple example is a poll on a controversial topic: the private data are the individual users' opinions, and the global function is the average opinion across all users. In practice, a user is often willing to share her private information with her trusted friends on the social network. These trusted friends comprise the "circle of trust" of the user. Our approach is based on partitioning the social network of users into such circles of trust, and performing partial computations on each partition. This is in contrast to popular solutions in which the circle of trust is either the user alone or the user together with the service provider. Our solution guarantees better privacy for a given computation accuracy compared to existing privacy-preserving mechanisms. Applications of our method include surveys, elections, and recommendation systems.
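One standard way to realize such partial computations inside a circle of trust is additive secret sharing: each user splits her value into random-looking shares, so no single circle member learns it, yet the shares sum to the true aggregate. The sketch below is a generic textbook construction, not necessarily the paper's exact protocol, and assumes nonnegative integer data smaller than the modulus.

```python
import random

MODULUS = 2**31 - 1

def share(value, circle_size):
    """Split `value` into additive shares that sum to it mod MODULUS.
    Each share alone is uniformly random and reveals nothing."""
    shares = [random.randrange(MODULUS) for _ in range(circle_size - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

def private_sum(values, circle_size=3):
    """Each user sends one share to each circle member; members publish
    only the sum of the shares they hold, which reveals the aggregate."""
    held = [[] for _ in range(circle_size)]
    for v in values:
        for member, s in enumerate(share(v, circle_size)):
            held[member].append(s)
    partials = [sum(h) % MODULUS for h in held]   # per-member partial sums
    return sum(partials) % MODULUS                # global aggregate

opinions = [1, 0, 1, 1, 0, 1]    # e.g., a yes/no poll
```

Dividing the private sum by the number of users then yields the average opinion without any individual opinion ever being disclosed.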
Posted by Runwei Zhang at 12:51
Nov 20th, Runwei
Title: Adaptive routing and rate allocation in rechargeable sensor networks
Distributed optimization techniques are widely used in many areas, including wireless network resource allocation, P2P network rate adaptation, and large-scale data mining. Here, we introduce these techniques in the context of rechargeable sensor networks. By jointly adapting the routing and the communication rates, time-varying energy sources, e.g., solar energy, can be fully utilized.
Some of you might already be quite familiar with these techniques. But for those who are not, I feel it is my duty to introduce them to you. Hope to see you there!
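A minimal sketch of the flavor of distributed technique involved: dual decomposition for rate allocation on a single shared link, a textbook network-utility-maximization example. The capacity, utility function, and step size are illustrative, not the talk's actual algorithm.

```python
import numpy as np

def rate_allocation(capacity, n_sources, steps=5000, lr=0.01):
    """Dual decomposition for  max sum(log x_i)  s.t.  sum(x_i) <= capacity.

    Each source solves its own subproblem, x_i = 1/price, while the link
    adjusts its price by the subgradient of the dual -- no source ever
    needs to know the other sources' rates.
    """
    price = 1.0
    x = np.zeros(n_sources)
    for _ in range(steps):
        x = np.full(n_sources, 1.0 / price)          # per-source best response
        price = max(price + lr * (x.sum() - capacity), 1e-6)  # price update
    return x

rates = rate_allocation(capacity=6.0, n_sources=3)
```

At the optimum every source receives capacity/n, which the iteration reaches purely through the shared price signal; the same decomposition idea underlies joint routing and rate adaptation under energy constraints.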
Posted by Runwei Zhang at 12:51
Nov 13th, Zichong
Title:  Cooperative Visual Monitoring in Energy-Constrained Wireless Sensor Networks
Place:  BC 329
Time:  5:15PM, Tuesday, Nov 13th, 2012
Posted by Runwei Zhang at 12:49
Nov 6th, Feng
I would like to give a TAM on recent results on the oversampled binary image sensor next Tuesday at 5:15 PM in BC 329.
    Recent results on the oversampled binary image sensor
We study a new image sensor that is reminiscent of traditional photographic film. Each pixel in the sensor has a binary response, giving only a one-bit quantized measurement of the local light intensity. The response function of this sensor is logarithmic, which makes the sensor suitable for acquiring high-dynamic-range scenes.
In this talk, we will discuss our recent work, including optimal threshold pattern design, adaptive thresholds, adaptive exposure time, the oversampled multi-bit image sensor, and the oversampled binary sensor with conditional reset.
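A minimal sketch of the one-bit model and where the logarithmic response comes from, assuming Poisson photon arrivals and a threshold of one photon per pixel. This is a simplified model, not the exact sensor design discussed in the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

def binary_sensor(light, n_pixels, rng):
    """One-bit pixels with threshold 1: a pixel fires iff at least
    one photon arrives, where photon counts are Poisson(light)."""
    photons = rng.poisson(light, size=n_pixels)
    return (photons >= 1).astype(float)

def ml_estimate(bits):
    """Invert P(bit = 1) = 1 - exp(-light): the sensor's logarithmic
    response, which is what enables high dynamic range."""
    p_one = bits.mean()
    return -np.log(1.0 - p_one)

light = 0.8                       # expected photons per binary pixel
bits = binary_sensor(light, 1_000_000, rng)
estimate = ml_estimate(bits)
```

With many binary pixels oversampling the same light level, the maximum-likelihood inversion of the one-bit statistics recovers the intensity accurately, film-grain style.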
Posted by Runwei Zhang at 12:47
Oct 30th, Juri
Phase retrieval problems arise in many fields, such as communications and optics. The problem has been known for almost a century and has been studied from many points of view. However, there are still interesting open questions, such as the existence of uniqueness conditions and of reliable reconstruction algorithms. We considered the case of N-sparse signals in a continuous multi-dimensional domain and showed the following:
- 1D signals have a unique reconstruction unless N=6;
- 1D signals with N=6 have a unique reconstruction if at least one delta has an amplitude different from the others;
- 1D signals with N=6 and with all deltas having the same amplitude are almost surely uniquely reconstructible;
- 2+D signals are always uniquely reconstructible.
Unfortunately, we are still looking for reliable reconstruction algorithms. Nonetheless, we show some reconstruction results obtained by our algorithm on astronomical images.
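To see why uniqueness is delicate, here is the classic "trivial" ambiguity: a real sparse signal and its mirror image have identical Fourier magnitudes, so magnitude-only measurements cannot tell them apart. This is a generic illustration (with made-up spike locations), unrelated to the N=6 analysis.

```python
import numpy as np

# A 5-sparse real signal on a length-64 grid (locations/amplitudes arbitrary).
x = np.zeros(64)
x[[3, 10, 25, 40, 50]] = [1.0, -2.0, 0.5, 1.5, -0.7]

mirror = x[::-1]                  # time-reversed copy of the signal

# Time reversal of a real signal conjugates its spectrum (up to a linear
# phase), so the Fourier magnitudes coincide exactly.
mag_x = np.abs(np.fft.fft(x))
mag_m = np.abs(np.fft.fft(mirror))
```

Uniqueness results therefore always hold "up to" such trivial transformations (shifts, reflection, global phase); the hard part is ruling out non-trivial ambiguities.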
Posted by Runwei Zhang at 12:47
Oct 23rd, Ivan
Many applications call for a generalized inverse. Very often, people just use the Moore-Penrose pseudoinverse without giving it much thought. There are cases when this is justified, for example the ML solution of an overdetermined system with Gaussian noise in the measurements. What about other cases? By dropping the orthogonality constraints implicit in Moore-Penrose, we get mn - m^2 degrees of freedom to play with. Thus we can optimize for desirable properties other than Moore-Penroseness, such as sparsity. And sparsity translates directly into the amount of computation. I will show how this is useful not only in the overdetermined case, but also for compressed sensing.
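A minimal sketch of that extra freedom: for a fat, full-rank m x n matrix A, any G with A G = I is a generalized inverse, and inverting just m linearly independent columns already yields one with only m^2 nonzeros. The matrix and the column choice below are arbitrary assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 4, 12
A = rng.standard_normal((m, n))       # fat matrix, full row rank (a.s.)

# Moore-Penrose: the dense, minimum-Frobenius-norm right inverse.
G_mp = np.linalg.pinv(A)

# A sparse alternative: invert m columns, put zeros everywhere else.
cols = [0, 1, 2, 3]                   # assumed linearly independent columns
G_sparse = np.zeros((n, m))
G_sparse[cols] = np.linalg.inv(A[:, cols])
```

Both matrices satisfy A G = I, but the sparse one has m^2 = 16 nonzeros instead of mn = 48, which is exactly the kind of computational saving the talk exploits.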
Posted by Runwei Zhang at 12:46
Oct 16th, Tao


Monitoring Network Structure and Content Quality of Signal Processing Articles on Wikipedia


Wikipedia has become one of the most widely used resources for signal processing. Therefore, maintaining a high quality standard for signal processing articles on Wikipedia is without doubt important. Nevertheless, the freelance-editing model of Wikipedia makes quality control and maintenance of these articles a real challenge. In this talk, we show recent advances in the development of algorithms, techniques, and tools to monitor the network structure and content quality of signal processing articles on Wikipedia. First, we analyze the network structure of signal processing articles and run network analysis algorithms to rank their importance. State-of-the-art algorithms such as PageRank and HITS are used to obtain the importance rankings of these articles, and we evaluate the benefits and drawbacks of the two. Second, we report the results of crowdsourcing the top-20 article rankings from researchers in the signal processing community. These results show that the network structure of the current articles could be improved to better reflect the importance recognized by community researchers. Third, we use a number of Information Quality (IQ) metrics to rank the content quality of these articles. We demonstrate that a differential analysis of IQ ranking and importance ranking can unveil articles that have high potential for improvement. Finally, we give a list of signa
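For reference, the PageRank importance ranking reduces to a short power iteration on the damped random-surfer transition matrix. The four-article link graph below is invented for illustration (and assumes every article links to at least one other).

```python
import numpy as np

def pagerank(adj, damping=0.85, iters=200):
    """Power iteration for PageRank. adj[i][j] = 1 if article i links
    to article j; every row must have at least one link (no dangling
    nodes in this simplified sketch)."""
    A = np.asarray(adj, dtype=float)
    n = A.shape[0]
    P = A / A.sum(axis=1, keepdims=True)       # row-stochastic link matrix
    r = np.full(n, 1.0 / n)                    # start from the uniform rank
    for _ in range(iters):
        r = damping * (P.T @ r) + (1 - damping) / n
    return r

# Toy link graph among four articles; article 2 receives the most links.
links = [[0, 1, 1, 0],
         [0, 0, 1, 0],
         [1, 0, 0, 0],
         [0, 0, 1, 0]]
ranks = pagerank(links)
```

The ranks sum to one, and the most-linked article comes out on top; HITS would instead iterate separate hub and authority scores on the same adjacency matrix.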

Posted by Runwei Zhang at 12:46