I thought I would point out a YouTube video that I think is really cool. It's from a channel I follow by Daniel Riley, a guy in middle America who flies remotely piloted aircraft. The videos consist of him flying various things for entertainment (his and ours) and sharing his adventures. The FPV (first person view) capability gives him a live video feed from the aircraft itself, which he views through a special headset; the quality of this feed is often poor. But he also captures high quality video to onboard storage that he can later edit into his videos. The sophistication of the videography and editing behind his videos is quite a feat, one that could easily go unacknowledged. I notice it, and it's awesome!
Daniel gets to apply RF in practical ways that are so different from what I do that I really am in awe! Different frequencies from all kinds of antennas with different applications - including aircraft control, FPV video, backup links and probably others that are less obvious...
This particular video is especially cool because he filmed it, mid-flight, during the total solar eclipse in the USA on 21st August 2017.
It's funny that Daniel was so dedicated to filming such a great event (and, I agree, it was completely worth it) that he left his friends, who were celebrating the eclipse together in the forest, to find a decent flight area. Thanks so much!
I haven't yet had the pleasure of being present for a total eclipse, at least not one that I can remember. But this most recent one in August 2017 has brought so much great footage to the Internet that we all get to watch it, at least by proxy, around the world.
It isn't Wi-Fi as we know it... but it's a pretty cool use of RF.
I highly recommend you check out and subscribe to rctestflight on YouTube if you liked this video and want to see more.
Risk should be the focus of IT and Network Engineers in 2017. It sounds boring, or maybe like it's "not my problem", but many of the threats facing organisations today could be mitigated by careful analysis and risk mitigation strategies. I often use Stuxnet as an example when talking to clients about risk, even though it is ultimately a flawed analogy. Stuxnet most probably did not reach its target via a network, but it is the most memorable internally resident threat that I can think of. Threats of this type are currently the most overwhelming risk to organisations' data and productivity: the threat acts from within the system, and sometimes it is successfully authenticated to the network. We need to think about networks and interconnectivity differently.
The benefit of successful risk mitigation is reduced downtime for users, or maybe it's zero financial loss to an organisation. Pretty compelling, right?
The problem we face with computers, as we have learned over time, is that we cannot make risk equal to zero. We fight the battle with one threat and a new one appears. Software and hardware inherently hold flaws somewhere along the way that can be compromised, and there is a growing pool of humans who are incentivised to find and take advantage of such flaws. Though we can't win every battle, it should be our duty to reduce the impact of the battles we happen to lose.
Through frequent analysis of risk we can strive to better prioritise our tactics and tools for risk mitigation.
Before WPA2 was available I worked my way through a Microsoft document to implement 802.1X for my wireless network. I walked through what felt like hours of steps to deploy a root CA certificate to clients, sign a certificate for a Windows IAS RADIUS server and create a few rules that allowed devices to authenticate to the network. At the time it was a huge feat and a largely unacknowledged win for security in my organisation. It was probably the best network access control of the time, short of adding some attributes to assign appropriate VLANs. When WPA2 emerged I was amazed at how simply I could convert my previous configuration to support the new standard. Back then it was all new and mostly magical, to me.
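For a sense of what that certificate-based 802.1X setup boils down to on the client side today, here is a minimal wpa_supplicant-style sketch. The SSID, identity and file paths are invented placeholders, not from any particular deployment:

```ini
# Hypothetical WPA2-Enterprise (802.1X / EAP-TLS) client configuration
# for wpa_supplicant. All names and paths are illustrative.
network={
    ssid="corp-wifi"
    key_mgmt=WPA-EAP
    eap=TLS
    identity="user@example.org"
    ca_cert="/etc/certs/root-ca.pem"       # the root CA deployed to clients
    client_cert="/etc/certs/client.pem"    # per-device certificate
    private_key="/etc/certs/client.key"
    # Check the RADIUS server's certificate name so the client
    # authenticates the server, not just the other way around.
    domain_suffix_match="radius.example.org"
}
```

The `ca_cert` and `domain_suffix_match` lines are the modern equivalent of the root-CA and server-certificate steps described above.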
Today I consult for organisations who need to enhance their network access as a basic step in risk mitigation. We make the authentication process more resilient by intelligently profiling devices as they connect to the Wi-Fi and flagging or denying devices that don't match their profile. We enhance the authentication process so that not only does the server authenticate the client, but the client also authenticates the server to be sure it is legitimate. I particularly love drawing huge whiteboard diagrams of Wi-Fi client devices and user types so we can profile who and what connects, and determine the appropriate level of access to grant. For most organisations these alone are large leaps forward; ACLs and traffic controls are a thought for the future.
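In spirit, that kind of profiling check can be sketched in a few lines. The roles, attributes and values here are invented for illustration, not drawn from any vendor's product:

```python
# Toy sketch of device profiling: compare a connecting device's observed
# attributes against the expected profile for its claimed role, and
# report any mismatches for flagging or denial. Values are illustrative.

EXPECTED_PROFILES = {
    "corp-laptop": {"oui_vendor": "Intel", "os_family": "Windows"},
    "voip-phone":  {"oui_vendor": "Cisco", "os_family": "Embedded"},
}

def check_device(claimed_role, observed):
    """Return a list of attribute mismatches; an empty list means the device fits."""
    profile = EXPECTED_PROFILES.get(claimed_role)
    if profile is None:
        return ["unknown role: " + claimed_role]
    return [
        f"{attr}: expected {want!r}, saw {observed.get(attr)!r}"
        for attr, want in profile.items()
        if observed.get(attr) != want
    ]

# A device claiming to be a VoIP phone but fingerprinting as Windows gets flagged.
mismatches = check_device("voip-phone", {"oui_vendor": "Cisco", "os_family": "Windows"})
```

Real products derive the observed attributes from things like MAC OUIs, DHCP options and HTTP user agents, but the decision logic is the same: does the device look like what it claims to be?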
In my opinion, layer 2-4 traffic rules should be applied as a baseline, both to enhance the efficiency of Wi-Fi networks and to reduce the impact of network-traversing threats. With experience and confidence in building networks with these types of controls, Network Engineers will be empowered to take advantage of the next wave of risk mitigation. We are finally at the cusp of networks having a level of self-awareness, where real-time analysis can determine the flows of application traffic and actions can be taken to enhance or block the transport of an application's data.
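As one possible baseline of the kind described above, a guest Wi-Fi VLAN might carry an inbound ACL like the following. This is a Cisco IOS-style sketch with placeholder names and addresses; guests get DHCP, DNS and web, and are blocked from the internal network entirely:

```text
! Hypothetical layer 3/4 baseline ACL for a guest Wi-Fi VLAN.
! ACL name, DNS server and internal range are illustrative placeholders.
ip access-list extended GUEST-WIFI-IN
 permit udp any any eq bootps
 permit udp any host 203.0.113.53 eq domain
 deny   ip any 10.0.0.0 0.255.255.255
 permit tcp any any eq 443
 permit tcp any any eq 80
 deny   ip any any
```

The ordering matters: the internal range is denied before the general web permits, so a guest can browse the Internet but never reach corporate subnets.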
Large analyst firms are helping vendors sell new and exciting technology by building infographics showing statistics of how threats are internally born versus the "legacy" external, inbound attacks. It's true that we pretty much have the perimeter - the Internet-facing border - of our networks locked down with firewalls; it's been that way for years. But some of the same technology that powers our "Next Generation Firewalls" can now be employed within the network, thanks to the enhanced processing capability available today. We are seeing emerging tools that allow live analysis and risk profiling of devices and users, highlighting potential internal threats as they emerge and enabling action or further investigation. It's these types of tools that will take us beyond simple authentication at the point of connection and static traffic rule-sets, to dynamic authorisation and access control.
Wi-Fi networks have had deep packet inspection capability for a few years now, allowing Network Engineers to prioritise or block network applications. This type of capability is becoming more widely available, even across the entire campus network. We will see big data collection and increasingly smart machine-learning capabilities in this space. This will allow us to build better networks that are more finely tuned to the needs of our users and, critically, will help us to reduce risk.
We build and optimise networks. Continuous learning is our secret to being good. Along the learning journey we will share things here...