[cs615asa] HW#N: Report - Attend a Meetup/Talk/community event.

Jigneshkumar Patel jpate35 at stevens.edu
Sat Apr 30 17:26:58 EDT 2016

Hello folks,

For the class HW#N, I attended two Meetups:

   - Web-Performance-NY / DNS as a Web Performance Tool @ Spotify on April
   - DevOpsQANJ Meetup #14/Automation with Jenkins and BDD @ Audible Inc.
   on April 6th

I found the groups on meetup.com.

*$ DNS as a Web Performance Tool*

After fun speed networking over pizza and beer, Sergey, the event host, gave
a brief introduction to the Web Performance - NY group and welcomed new
members to the group he created seven years ago. He then asked everyone to
‘turn and talk’ to the person sitting next to them about why performance is
important.

Next, Dr. Kris Beevers, co-founder and CEO of NS1, gave a talk on DNS as a
web performance tool. He began by asking a few basic questions about DNS,
CDNs, etc. to calibrate himself to the audience. He gave a quick recap of the
DNS lookup process: root servers, recursive lookups, and caching resolvers -
the most important concept when talking about performance. He talked about
caching at the different stages of the hierarchical DNS infrastructure. He
demonstrated how Google Chrome has a built-in DNS cache for frequently and
recently requested domains, with a default capacity of 1000 entries, to
optimize performance. To view the cache entries, one can type the internal
address he showed into the browser.
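
The caching he described is easy to observe from a script. Below is a
minimal Python sketch (my own, not from the talk) that times repeated name
lookups through the OS stub resolver; repeated lookups of the same name are
typically faster once a resolver along the path has cached the answer.

```python
# Hedged sketch (not from the talk): time repeated DNS lookups through the
# OS stub resolver to see the effect of caching along the resolution path.
import socket
import time

def time_lookup(host):
    """Return the wall-clock time of one name resolution, in milliseconds."""
    start = time.perf_counter()
    socket.getaddrinfo(host, 80)
    return (time.perf_counter() - start) * 1000

# 'localhost' keeps the sketch runnable offline; substitute a real domain
# (e.g. the one you are tuning) to compare cold vs. warm cache timings.
for attempt in range(3):
    print(f"lookup {attempt + 1}: {time_lookup('localhost'):.2f} ms")
```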


He also explained how referencing resources from too many different domains
hurts performance. He discussed the most commonly incurred DNS lookup costs,
which depend on many factors, such as the popularity of the domain, cache
TTLs, and the geographical distribution of users. He recommended identifying
assets with slow lookups in the DNS resolution path. He explained how cache
TTLs were historically much higher, but today many use DNS to steer traffic,
so TTLs are dropping in exchange for more control. Later, he demonstrated a
performance test on nytimes.com, resolving it from different locations and
getting almost the same resolution time, and showed how one can optimize DNS
lookup performance for a domain with managed DNS. He also talked about
operationally complex, high-performing anycasted DNS networks, which are
tightly optimized for latency but expensive to build and maintain.

He discussed the modern approach to real-time DNS traffic management, which
draws on performance data from network metrics such as latency, throughput,
and reachability. Lastly, he wrapped up the session with some takeaways for
optimizing DNS performance to improve the overall performance of the
application, and answered questions from the audience. I learned that the
BIND9 caching server maintains an SRTT (smoothed round-trip time) for each
authoritative server and queries the server with the smallest SRTT.
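
The SRTT idea is essentially an exponentially weighted moving average over
observed round-trip times. A small Python sketch of the concept (the alpha
value is illustrative, not BIND9's actual constant):

```python
# Hedged sketch of the SRTT (smoothed round-trip time) idea: blend each new
# RTT sample into a running average, then query the fastest-looking server.
# alpha=0.3 is an illustrative smoothing factor, not BIND9's real one.
def update_srtt(srtt, sample, alpha=0.3):
    """Fold a new RTT sample (ms) into the smoothed estimate."""
    if srtt is None:          # first sample initializes the estimate
        return sample
    return (1 - alpha) * srtt + alpha * sample

def pick_server(srtts):
    """Choose the authoritative server with the smallest SRTT."""
    return min(srtts, key=srtts.get)

srtt = None
for sample in [40.0, 60.0, 35.0]:   # hypothetical RTT samples, in ms
    srtt = update_srtt(srtt, sample)
print(round(srtt, 2))
```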

More info: http://www.meetup.com/Web-Performance-NY/events/229618918/

*$ Automation with Jenkins and BDD*

The event consisted of two talks:

$ BDD with Codeception by Tom Bartolucci

He talked about how the team of web developers at his company, Billtrust,
uses Codeception, a PHP-based Behavior-Driven Development framework, for
testing. Tom explained that the idea is not to find bugs, but to prevent
them. Codeception allows tests to be written in PHP, which is what its
development team uses; it also has hooks for various tools and platforms and
built-in reports. It allows testers to create BDD-style readable tests and
has a variety of uses, including writing acceptance tests, unit tests, etc.
He also gave a live demo of the framework, showing how they integrated
Codeception with Jenkins to execute tests and publish reports.
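
Codeception's tests are written in PHP, but the readable given/when/then
style carries over to any language. A rough Python analogue (the Cart class
and names here are hypothetical, not from the talk):

```python
# Hedged sketch: a BDD-flavored test in Python, analogous in spirit to the
# Codeception examples from the talk. Cart is a hypothetical class invented
# for illustration.
class Cart:
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)

    def total(self):
        return sum(price for _, price in self.items)

def test_adding_an_item_updates_the_total():
    # Given an empty cart
    cart = Cart()
    # When the shopper adds an item
    cart.add(("book", 12.50))
    # Then the total reflects the item's price
    assert cart.total() == 12.50
```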

$ Automation via Jenkins CI Pipeline by Kishore Bhatia

He talked about CI automation with Jenkins pipelines and Gerrit code
reviews, as well as the bottleneck of connecting multiple pipelines
together. He focused on a large-codebase use case, where many teams
contribute to the code. He recommended that teams not collaborate over
email; instead, the teams should get notifications, and post-commit
validations should be done by the tool they use. His team uses Gerrit,
Google's open-source project for reviewing every commit before it is
accepted into the codebase. He demonstrated the use case during the
session. At the end, he talked about parallel commits and test data
management, which triggered lots of questions from the audience.

I decided to attend this event to learn about automation and explore DevOps
workflows and tools. I learned about automated testing: whenever there are
commits, an automated process runs all the tests. I also learned about the
Continuous Integration tool Jenkins, which runs tests automatically every
time someone pushes new code to the repository, so one knows exactly which
push broke the build.
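
That "which push broke the build" idea can be sketched in a few lines of
Python (run_tests here is a stand-in for a real Jenkins test job, and the
push names are made up):

```python
# Hedged sketch: run the test suite after every push, in order, so you know
# exactly which push introduced a failure. run_tests is a stand-in for a
# real CI job that returns True on a green build.
def first_failing_push(pushes, run_tests):
    """Return the index of the first push whose tests fail, or None."""
    for i, push in enumerate(pushes):
        if not run_tests(push):
            return i
    return None

pushes = ["c1", "c2", "c3"]
broken = {"c3"}  # pretend push c3 introduced the failure
print(first_failing_push(pushes, lambda p: p not in broken))  # → 2
```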

More info: http://www.meetup.com/DevOpsandAutomationNJ/events/229777583/

Thank you,