  • IT477 Introduction to GPU Programming Fall 2014

    Aim and objective: The course aims to give an overview of an important trend in high-performance computing: GPU programming. GPUs (graphics processing units) are special-purpose hardware originally designed for graphics and games; however, GPUs are also very efficient at solving some general-purpose computing problems.

    General-Purpose computing on Graphics Processing Units (GPGPU) primarily refers to the use of GPUs for computationally intensive mathematical and scientific computing. The enormous peak performance of GPUs on arithmetically intensive computations, at a much lower cost than CPUs, makes GPU computing a very attractive alternative for computationally demanding problems.

    The course will help students understand the basic concepts of GPU programming and the CUDA (Compute Unified Device Architecture) parallel computing platform, and will provide hands-on experience implementing some standard CUDA programs. Finally, the course will give a brief overview of current applications and future trends of GPU computing in scientific research.

    Main topics to be covered: 1) Introduction to parallel programming. 2) NVIDIA GPU architecture. 3) The GPU/CUDA programming model. 4) Overview of GPGPU computing techniques. 5) Optimizing GPU programs. 6) The future of GPU computing and new GPU features. 7) CUDA example codes and practicals (a minimal example in this spirit is sketched below).
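
    To give a flavour of the practicals, the "hello world" of CUDA is element-wise vector addition, where each GPU thread computes one element of the result. The sketch below is only an illustrative example of this style of program, assuming a CUDA-capable GPU and the CUDA toolkit; the kernel name, array size and block size are arbitrary choices and are not taken from the course material.

    // vec_add.cu -- minimal CUDA vector addition (illustrative sketch only)
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    // Kernel: each thread adds one pair of elements.
    __global__ void vecAdd(const float *a, const float *b, float *c, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // global thread index
        if (i < n)                                       // guard: last block may be partial
            c[i] = a[i] + b[i];
    }

    int main()
    {
        const int n = 1 << 20;                 // 1M elements (arbitrary size)
        const size_t bytes = n * sizeof(float);

        // Allocate and initialise host arrays.
        float *ha = (float *)malloc(bytes);
        float *hb = (float *)malloc(bytes);
        float *hc = (float *)malloc(bytes);
        for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

        // Allocate device arrays and copy the inputs to the GPU.
        float *da, *db, *dc;
        cudaMalloc((void **)&da, bytes);
        cudaMalloc((void **)&db, bytes);
        cudaMalloc((void **)&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements.
        const int threadsPerBlock = 256;
        const int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
        vecAdd<<<blocks, threadsPerBlock>>>(da, db, dc, n);

        // Copy the result back (this also synchronizes) and check one element.
        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
        printf("c[0] = %f (expected 3.0)\n", hc[0]);

        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(ha); free(hb); free(hc);
        return 0;
    }

    Compiling with nvcc (for example, nvcc vec_add.cu -o vec_add) and running on a CUDA-capable GPU should print c[0] = 3.0. The launch configuration <<<blocks, threadsPerBlock>>> illustrates the CUDA programming model of topic 3: the same kernel code is executed by many threads in parallel, each identified by its block and thread indices.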

    Self enrolment: IT477-Introduction to GPU Programming Fall 2014
  • IT468 Natural Computing Fall 2014

    In the last 50 years, Information and Communication Technology (ICT) has had a great impact on our society. The most profound and accelerated impact of ICT can be seen in the last decade in the form of cell phones, connected computers and the Internet; we even have virtual currency. ICT is an interdisciplinary field combining IT (Information Technology) and CT (Communication Technology). IT has its roots in computer science, and CT has its roots in the theory of communication. The two fields can now be seen as two sides of the same coin: both deal with information. In IT we store information (send it from now to then) and manipulate it, while in CT we send information from here to there (communicate). The mathematical principles of ICT lie in theoretical computer science (the Turing machine) and in information and coding theory (the work of Shannon and Hamming). ICT is realized through logic gates and circuits in the areas of electronics and VLSI. Looking around at Nature, one often wonders: what are the principles of natural ICT? Can we use these principles to create natural ICT engineering?

    Natural computing is a recent branch of computer science in which we learn from nature how to compute with natural living materials such as DNA, proteins and bacteria. We want to solve complex problems with the help of DNA computers, bacterial computers or chemical computers, and we want to store our data on such living things. For this we require molecular/natural algorithms and natural error control.

    In July 2009, scientists showed that a bacterial computer can solve a simple Hamiltonian path problem. In June 2011, Erik Winfree's group built the then-largest DNA computer, one for finding square roots. In July 2012, Martin Fussenegger's group built single-cell mammalian biocomputers. We also have skin computing, human visual computing, and so on.

    Keywords:

    Unconventional computing
    Optical computing
    Quantum computing
    Chemical computing
    Natural computing
    Biologically-inspired computing
    Wetware computing
    DNA computing
    Molecular computing
    Amorphous computing
    Nano computing
    Reversible computing
    Ternary computing
    Fluidics / analogue computing
    Domino computation
    Billiard-ball computing
    Swarm intelligence
    Morphological computing
    Liquid computing
    Peptide computing
    Membrane computing
    Bio-molecular computing
    Bacterial computing
    Ant computing...
    Monkey computing...
    Elephant computing...



    Guest access: IT468 Natural Computing Fall 2014
    Self enrolment: IT468 Natural Computing Fall 2014
  • Life is the most complex phenomenon in the universe, and each of us keeps asking the fundamental question: what is the purpose of life? In 1944 Schrödinger wrote a book titled "What is Life?". At the dawn of the 21st century, thanks to advances from the Human Genome Project, tremendous opportunities are now available to look at DNA, the actual software of life (now known as Life 1.0). As engineers, a natural question we ask ourselves is: how can we create life from the known principles of Life 1.0? What are the basic building blocks of life? Attempts to answer these questions have given rise to a new area on the horizon known as synthetic biology, or Life 2.0. Biology revolves around the central dogma: DNA stores the blueprint of life, and whenever a cell wants to make a protein (the working molecule of life), it reads a portion of the DNA, makes RNA from it, and that RNA is then mapped to the primary sequence of the protein via the genetic code. Hence, if we want to make life, we need to build its basic building blocks synthetically in the lab. There has also been a lot of progress in another direction, known as DNA nano-biotechnology, where one builds structures at the nanoscale using DNA as the raw material. DNA self-assembly, and in particular DNA origami, has come a long way, and many interesting things have been created in both 2D and 3D. Synthetic DNA strands that can store data, or that can carry small drug molecules, are examples of its applications. Synthetic biology could be very useful to mankind: for example, one could create organisms for cleaning water, seeding rain, producing energy, producing oil, and so on. The applications are endless. Recently, a group in Israel has created biological semiconductors.

    The term “synthetic biology” was first used in 1910 and 1912, and later in 1978. At the beginning of the 21st century, most of the work started by looking at gene expression at different levels. Michael Elowitz did the first work on the repressilator in 2000, which was later extended by many people by modelling and analysing gene expression using different mathematical techniques. Also in 2000, in pioneering work, Tim Gardner and Jim Collins engineered a genetic toggle switch; around the same time researchers reported the synthesis of a 9.6 kbp (kilobase pair) Hepatitis C virus genome from chemically synthesized 60-80 mers. In 2002, researchers reported the synthesis of the 7741-base poliovirus genome as the second synthetic genome. In 2003, the 5386 bp genome of the bacteriophage Phi X 174 was synthesized, and in 2006 the same team constructed and patented the synthetic genome of the bacterium Mycoplasma laboratorium. Craig Venter’s group reported the first synthetic cell in May 2010, at a cost of about 40 million US dollars. Most recently, in May 2014, a semi-synthetic E. coli bacterium was reported whose DNA uses six letters rather than the usual four (A, C, G and T).

    More in the course… so fasten your seatbelts.

    This course is designed for BTech, MTech and PhD students. It is useful for any ICT student (both computer science and communication students) and for anyone who wants to learn about the upcoming field of bioengineering.

    Guest access: SC462 Elements of Synthetic Biology: Life 2.0 Fall 2014
    Self enrolment: SC462 Elements of Synthetic Biology: Life 2.0 Fall 2014