Arm Inc., Austin, Texas
Digital computers have enjoyed unprecedented improvements in performance, power, area and cost due to miniaturization of transistors. As device dimensions reach atomic scales, continuing transistor scaling becomes increasingly complex and expensive. In this talk I'll discuss some of the key limitations of device scaling in the coming years and some potential solutions to get 'equivalent scaling'. The talk will give a basic primer on modern digital design methodology, design-technology co-optimization at advanced technology nodes and some disruptive technologies being explored in Arm Research, such as monolithic 3D-ICs, alternative materials for interconnects, etc.
Amazon (India), Bangalore
The talk focuses on the main methods used in the Machine Learning (ML) domain. It also discusses what is expected of freshers entering the ML domain.
Reliance JIO Infocomm Ltd
Mobile technology evolution & comparison; network architecture; mobile backhaul evolution from TDM to an 'All-IP Network'; 4G mobile call flow & the road ahead
It is a common issue in analyzing clustered data that the outcome is associated with cluster size. This talk addresses the informative cluster size problem in linear and generalized linear mixed models when the cluster size is incompletely observed. The problem is motivated by the NICHD Consecutive Pregnancies Study, whose objective is to study the relationship between pregnancy outcomes (continuous or discrete) and parity. It is hypothesized that these pregnancy outcome profiles are associated with the number of births over a woman's lifetime, resulting in an informative cluster size. However, in this study a woman's lifetime number of births is not observed (it is censored at the end of the study window). In this work we develop a pattern mixture model that accounts for informative cluster size by treating the incomplete cluster size (the lifetime number of births) as a latent variable. We compare this approach with the simple alternative of using the number of births observed by the end of the study as the cluster size. For estimating the population mean trajectory, we show theoretically, in simulations, and in the real data application that the latent variable approach possesses good statistical properties.
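The censoring issue can be illustrated with a toy simulation. This is not the pattern mixture model from the talk; all cluster-size probabilities and effect sizes below are invented for illustration.

```python
import random

random.seed(0)
n = 20000
# Latent lifetime number of births (true cluster size), values 1..3.
# Probabilities and effect sizes here are made up for illustration.
N = random.choices([1, 2, 3], weights=[0.3, 0.4, 0.3], k=n)
# The study window censors observation at C_i births:
C = random.choices([1, 2, 3], weights=[0.2, 0.3, 0.5], k=n)
obs = [min(ni, ci) for ni, ci in zip(N, C)]  # observed (censored) size

# Outcome mean depends on *lifetime* parity, i.e. the cluster size is
# informative about the outcome (noise omitted to keep the bias visible).
def mu(ni):
    return 1.0 + 0.5 * ni

# Target: mean outcome among women whose true lifetime size is 1 (= 1.5).
oracle = sum(mu(ni) for ni in N if ni == 1) / N.count(1)
# Naive: condition on the *observed* size 1, which also sweeps in
# censored size-2 and size-3 women, biasing the estimate upward.
sel = [ni for ni, oi in zip(N, obs) if oi == 1]
naive = sum(mu(ni) for ni in sel) / len(sel)
```

With these invented numbers the naive estimate lands well above the true value of 1.5, because the observed-size-1 group is a mixture of true sizes, which is the misclassification that the latent variable approach is designed to avoid.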
Present research on simulating human vision and on vision-correcting displays that compensate for the optical aberrations in the viewer's eyes will be discussed. The simulation is not an abstract model but incorporates real measurements of a particular individual's entire optical system. In its simplest form, these measurements can be the individual's eyeglasses prescription; beyond that, more detailed measurements can be obtained using an instrument that captures the individual's wavefront aberrations. From these measurements, synthetic images are generated: the process modifies input images to simulate the appearance of the scene for that individual. Examples will be shown of simulations using data measured from individuals with high myopia (near-sightedness), astigmatism, and keratoconus, as well as simulations based on measurements obtained before and after corneal refractive (LASIK) surgery.
Recent work on vision-correcting displays will also be discussed. Given measurements of the optical aberrations of a user's eye, a vision-correcting display presents a transformed image that, when viewed by that individual, appears in sharp focus. This could impact computer monitors, laptops, tablets, and mobile phones, and could provide vision correction in some cases where spectacles are ineffective. One potential application of interest is a heads-up display that would enable a driver or pilot to read the instruments and gauges with his or her lens still focused for the far distance.
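As a rough illustration of the forward simulation step, the sketch below blurs a scene with a disk-shaped point spread function (PSF) standing in for pure defocus. A real simulator would derive the PSF from the measured wavefront aberrations; the disk kernel, image size, and blur radius here are toy assumptions.

```python
def defocus_psf(radius):
    """Normalized disk PSF approximating pure defocus blur.
    (A real simulation would build the PSF from measured wavefront
    aberrations, e.g. via a Zernike expansion; this is a stand-in.)"""
    disk = [[1.0 if dx * dx + dy * dy <= radius * radius else 0.0
             for dx in range(-radius, radius + 1)]
            for dy in range(-radius, radius + 1)]
    total = sum(map(sum, disk))
    return [[v / total for v in row] for row in disk]

def simulate_view(image, psf):
    """Convolve the scene with the eye's PSF: the retinal image a viewer
    with this aberration perceives (zero padding at the borders)."""
    h, w = len(image), len(image[0])
    k = len(psf) // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            s = 0.0
            for dy in range(-k, k + 1):
                for dx in range(-k, k + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        s += psf[dy + k][dx + k] * image[yy][xx]
            out[y][x] = s
    return out

# A single bright point: its blurred image is exactly the PSF, so the
# blur spreads the point's energy over the disk and lowers its peak.
scene = [[0.0] * 15 for _ in range(15)]
scene[7][7] = 1.0
blurred = simulate_view(scene, defocus_psf(3))
```

A vision-correcting display works in the opposite direction: it pre-distorts the image so that this very blur, applied by the viewer's own optics, yields a sharp percept.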
Market games between firms producing similar goods are typically played with production levels (Cournot competition) or market prices (Bertrand competition) as the firms' strategies.
In this talk, I will first introduce the basics of game theory: games, equilibria and the related computational problems. I will then discuss equilibria in market delegation games and show how firms benefit by delegating their strategic decisions.
This talk assumes no background in Mathematical Economics and will be self-contained.
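As a warm-up for the kind of game involved, here is a minimal sketch of a Cournot duopoly, the textbook market game with production levels as strategies. The linear demand curve and all parameter values are made up for illustration; this example is not from the talk.

```python
# Cournot duopoly: inverse demand p = a - b*(q1 + q2), identical
# constant marginal cost c. Parameter values are invented.
a, b, c = 100.0, 1.0, 10.0

def best_response(q_other):
    """Maximize profit (a - b*(q + q_other) - c) * q over q >= 0.
    The first-order condition gives the closed form below."""
    return max((a - c - b * q_other) / (2 * b), 0.0)

# Best-response dynamics: each firm repeatedly replies optimally to the
# other's last output; for linear demand this converges to the Nash
# equilibrium, where neither firm can gain by deviating unilaterally.
q1 = q2 = 0.0
for _ in range(100):
    q1, q2 = best_response(q2), best_response(q1)

# Known closed form for the symmetric Cournot-Nash equilibrium output:
analytic = (a - c) / (3 * b)
```

The fixed point of the two best-response maps is exactly the Nash equilibrium the talk's computational questions are about: here it can be found by iteration, but in general computing equilibria is hard.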
We will perform a card trick and reveal its protocol so you can impress your friends. We will see how a Graph Theory model of the trick enhances understanding and helps in generalizing it. The talk is meant to be accessible to all students on campus.
Modern processor architectures employ optimizations such as store buffers. Such an optimization, however, may result in program executions that violate Sequential Consistency: program statements may appear to have been reordered, violating the program order. Some of these executions may violate a safety property (an assertion). Architectures provide fence instructions (memory barriers) that can be inserted to avoid unwanted reordering. Too many fences may degrade performance drastically, whereas too few fences may result in buggy behaviour. Due to non-determinism in scheduling and reordering, it may be very difficult even for an expert programmer to insert fences optimally.
Automated property-driven fence insertion techniques have been proposed that repair a concurrent program by suggesting an optimal fence placement for a given architecture. In this talk, I will introduce a technique we call "Reorder Bounded Model Checking" (ROBMC), which introduces a new bounding parameter, the number of reorderings, into bounded model checking. We show that the ROBMC-based approach outperforms traditional property-driven fence insertion techniques. This work was presented and published at FM 2015.
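The effect of store-buffer reordering can be seen on the classic two-thread "SB" litmus test. The sketch below is not ROBMC itself, just an exhaustive enumeration of the program's executions with and without the load-overtakes-store reordering that a fence would forbid.

```python
from itertools import product

def interleavings(t1, t2):
    """All interleavings of two op sequences, preserving each one's order."""
    if not t1:
        yield list(t2)
        return
    if not t2:
        yield list(t1)
        return
    for rest in interleavings(t1[1:], t2):
        yield [t1[0]] + rest
    for rest in interleavings(t1, t2[1:]):
        yield [t2[0]] + rest

def run(schedule):
    """Execute a schedule of (op, location, value/register) triples."""
    mem, regs = {"x": 0, "y": 0}, {}
    for op, loc, val in schedule:
        if op == "w":
            mem[loc] = val
        else:
            regs[val] = mem[loc]
    return regs["r1"], regs["r2"]

# SB litmus test: each thread writes one flag, then reads the other's.
T1 = [("w", "x", 1), ("r", "y", "r1")]
T2 = [("w", "y", 1), ("r", "x", "r2")]

def outcomes(reorder_allowed):
    """All (r1, r2) results. A store buffer may let the load overtake the
    earlier store; a fence between them forbids that (reorder_allowed=False)."""
    orders1 = [T1, [T1[1], T1[0]]] if reorder_allowed else [T1]
    orders2 = [T2, [T2[1], T2[0]]] if reorder_allowed else [T2]
    results = set()
    for o1, o2 in product(orders1, orders2):
        for sched in interleavings(o1, o2):
            results.add(run(sched))
    return results

sc = outcomes(False)   # sequentially consistent / fully fenced executions
tso = outcomes(True)   # store-buffer reordering permitted
```

Under Sequential Consistency the outcome r1 == r2 == 0 is impossible, but once each load may overtake its thread's earlier store it appears, which is exactly the kind of extra behaviour an assertion-violating execution exploits and a fence removes.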
Wizard Merlin wants to convince King Arthur that a mathematical statement is true, but Merlin does not want to reveal anything other than the validity of the statement. Can he do that? In 1989, Goldwasser, Micali and Rackoff introduced the notion of Zero Knowledge Proofs: proofs that are convincing and yet reveal nothing other than the validity of the statement. In this talk we discuss zero knowledge proofs and their importance in computer science. Zero Knowledge Proofs are one of the concepts for which Goldwasser and Micali received the Turing Award in 2012. If time permits, we will discuss a recent connection between zero knowledge proofs and the minimum circuit size problem. This part is based on joint work with Eric Allender.
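A flavor of the idea can be conveyed with a Schnorr-style toy protocol in which Merlin proves knowledge of a discrete logarithm. The parameters are deliberately tiny and insecure, and this particular protocol is an illustration of the concept, not one taken from the talk.

```python
import random

# Toy interactive proof of knowledge of a discrete log.
# Tiny, insecure parameters, chosen only so the arithmetic is visible.
p = 467                      # small prime
q = 233                      # prime with q | p - 1
g = pow(2, (p - 1) // q, p)  # element of order q
x = 57                       # Merlin's secret
y = pow(g, x, p)             # public: Merlin claims to know log_g(y)

def prove_round():
    """One round: Merlin commits, Arthur challenges, Merlin responds."""
    r = random.randrange(q)          # Merlin's fresh randomness
    t = pow(g, r, p)                 # commitment
    c = random.randrange(q)          # Arthur's random challenge
    s = (r + c * x) % q              # response: masked by r, leaks no x
    return pow(g, s, p) == (t * pow(y, c, p)) % p  # Arthur's check

def simulate_transcript():
    """Arthur alone can forge a verifying transcript (t, c, s) with the
    same distribution WITHOUT knowing x, by picking c and s first and
    solving for t -- the essence of the zero knowledge property."""
    c = random.randrange(q)
    s = random.randrange(q)
    t = (pow(g, s, p) * pow(y, -c, p)) % p
    return pow(g, s, p) == (t * pow(y, c, p)) % p

honest_ok = all(prove_round() for _ in range(20))
```

Since g^s = g^(r + c·x) = t · y^c (mod p), an honest Merlin always convinces Arthur; and since forged transcripts are distributed like real ones, the interaction reveals nothing beyond the truth of the claim.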