Magic Fractals



Introduction

This page describes a type of fractal derived from the Newton-Raphson method, which is more normally used as an approximate method of solving equations.

This is not a new idea to me; I was given the idea by a colleague at work, and several other people have web pages about it too. I'm putting up yet another one because it contains some additions to the concept which I haven't seen anywhere else.

Explanation

Newton-Raphson iteration should be familiar to anyone who has studied calculus; it's a method for finding roots of a function by using the derivative of the function to improve an approximation to the root.

To perform a Newton-Raphson approximation, suppose you have a function $f(x)$, with derivative $f'(x)$, and you have an approximation $a$ to a root of the function. The Newton-Raphson procedure is to calculate $a'=a-f(a)/f'(a)$, which is a closer approximation to the root. Typically you would then iterate this again, and again, until the successive values were extremely close together, at which point you would conclude that you had a very good approximation to the actual value $r$ for which $f(r)=0$.
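The procedure just described can be sketched in a few lines of Python (a minimal illustration; the function names are mine, not from this page):

```python
import math

def newton_raphson(f, fprime, a, tol=1e-12, max_iter=100):
    """Repeat a' = a - f(a)/f'(a) until successive values agree to tol."""
    for _ in range(max_iter):
        a_next = a - f(a) / fprime(a)
        if abs(a_next - a) < tol:
            return a_next
        a = a_next
    return a

# approximate sqrt(2) as a root of f(x) = x^2 - 2, starting from a = 1:
root = newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```

Starting from the fairly crude guess $a=1$, only a handful of iterations are needed.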

The Newton-Raphson method is useful in practice because of its extremely fast convergence. The distance from the root to each approximation is roughly squared at each iteration; so assuming the distance is already small enough that this makes it smaller rather than larger, you expect to double the number of correct decimal digits in each approximation. So if you can find a reasonably good approximation to begin with, Newton-Raphson can very quickly give you an excellent one.

What the Newton-Raphson formula is essentially doing is drawing a tangent to the curve at the point of the original approximation, then following that tangent to where it crosses the x-axis. Since any differentiable function looks close to a straight line when viewed at sufficient magnification, this explains why it works so well: the curve itself does not diverge from the tangent line by very much, and the points where they cross the x-axis are very close to each other. Hence, this technique massively improves an already good approximation.

However, if you start with a really bad approximation, much more interesting things happen. Suppose the function curves up from one intersection with the x-axis and back down to another, like a parabola; and suppose your initial approximation is somewhere near the top of this arc. Now drawing a tangent to the curve and following it to the x-axis will land you a huge distance away from the roots of the function – and as your initial approximation $a$ crosses the maximum point of the curve, the Newton-Raphson second approximation $a'$ will flip from one side of the roots to the other. In fact, as $a$ moves the relatively short distance across the maximum of the curve, $a'$ will cover most of the real line.

This sort of behaviour, expanding a small area into a large one, is exactly the sort of behaviour we expect to give rise to self-similar fractals. So if we were to start a Newton-Raphson iteration at each point on the real line, run each iteration until it converged to within a given tolerance level of a root, and then colour the starting point according to which root it ended up at, we might well expect to see fractal shapes.

Fractals on one line are not very interesting, however; so let's work in the complex plane. The Newton-Raphson iteration still works perfectly well there, so that's where I'll be generating my fractals.
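The colouring procedure can be sketched like this, using $z^4-1$ as the example function (a minimal Python illustration of the idea; this is not the program that actually drew the pictures on this page):

```python
def newton_fractal_colour(z, roots, tol=1e-9, max_iter=256):
    """Which root of z^4 - 1 does the starting point z converge to,
    and how many iterations did it take to get there?"""
    for n in range(max_iter):
        for i, r in enumerate(roots):
            if abs(z - r) < tol:
                return i, n            # index of root, iteration count
        z -= (z**4 - 1) / (4 * z**3)   # one Newton-Raphson step
    return None, max_iter              # failed to converge: colour it black

roots = [1, -1, 1j, -1j]
index, iters = newton_fractal_colour(0.9 + 0.1j, roots)
```

A real plotter simply runs this for every pixel and maps the returned root index to a colour.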

Illustration

Here's an example fractal, generated from the polynomial $z^4-1$, so that the four roots of the function are at $-1$, $+1$, $-i$ and $+i$.

In this image, we see a large boring area surrounding each root of the function – as we would expect, since any point near a root will converge rapidly to that root and do nothing interesting. But between the areas of boring well-behaved convergence, we see some beautiful fractal shapes. Let's zoom in on one of those boundary areas:

Just as we predicted – each of the heart-shaped blobs making up the boundary line is itself composed of boundary lines made up of further heart-shaped blobs. This pattern is a true fractal.

Here's a rather different example. This time the function being used is $(z-3)(z-2)(z-1)z(z+1)(z+2)(z+3)$, so it has seven roots strung out in a long line:

In this case, the fractal shapes are much smaller compared to the overall structure of the image. But they're not completely absent. If we zoom in on a couple of the little blobs on the boundary lines, we see this:

Each blob is divided up into coloured areas similar to those covering the whole plane, and on each dividing line we see more blobs looking much the same as the larger blobs.

Decoration

These images are reasonably pretty, but they're a bit garish to be turned into desktop wallpaper in their current form. Is there anything we can do to make them less stark?

Yes, there is. One obvious thing we can do, as well as noticing which root of the function the iteration ended up at, is to count how many iterations it took to get there. We can then colour each pixel a different shade of the colour assigned to that root depending on the number of iterations. So, using the obvious approach of setting the pixel shade to the number of iterations modulo the number of available shades (so that each colour cycles through those shades), we see something like this:

This is not only prettier, but it also shows us exactly where each root of the function is – instead of just knowing the roots are somewhere in the large coloured areas, we can now positively identify each root as the centre of the bright spot in each area.

The cyclic behaviour is not quite optimal, though; it works well enough if the number of available colours is limited, but it means there are sudden edges (like the ones at the very centre of each region) where a dark colour suddenly becomes a bright colour again. Perhaps if we have true colour available, it would be better to have the pixel shading be monotonic – always getting darker the more iterations are needed, but fading out by less and less and never actually reaching blackness:

Now that's starting to look much nicer, I think. But it would be even better if the visible boundaries between different shades of the same colour could be removed. I suspect that doing this rigorously requires some really horrible maths and a lot of special cases, but I've found that a good ad-hoc approximation is obtained simply by looking at the last iteration, in which the point first comes within the specified distance of a root. We look at the distance $D_0$ from the previous point to the root and the distance $D_1$ from the new point to the root, and we know that the threshold distance $T$ is somewhere in between the two. I've found that simply looking at $$\frac{\log T-\log D_0}{\log D_1-\log D_0}$$ or, in other words, whether the log of the threshold radius was near to the start or the end of the inward distance travelled by the point (on a logarithmic scale), produces a perfectly acceptable result which we can use to smooth out those boundaries:
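That smoothing fraction is a one-liner to compute (a sketch; the numbers in the example are mine, chosen only to satisfy $D_1 < T < D_0$):

```python
import math

def boundary_fraction(d0, d1, threshold):
    """Fractional position, on a log scale, of the threshold radius T
    between the last two distances to the root (D1 < T < D0)."""
    return ((math.log(threshold) - math.log(d0))
            / (math.log(d1) - math.log(d0)))

# e.g. the point jumps from distance 0.01 to 0.001 past a threshold of 0.003:
frac = boundary_fraction(0.01, 0.001, 0.003)
```

The result always lies between 0 and 1, and adding it to the integer iteration count gives a continuous shading value.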

Animation

In order to write a program to generate these images, it's necessary to know both the function being used (typically a polynomial) and the exact locations of all its roots. Finding the exact roots of a general polynomial is not easy (cubics and quartics are just about solvable, but quintics and beyond fall foul of Galois theory), so it makes much more sense to start by deciding where we want the roots to be, and using that to compute the polynomial by multiplying together a series of $(x-a)$ terms. This does not restrict the range of polynomials we can end up with, since in the complex plane any polynomial can be fully factorised.

So the actual parameters you would pass to the fractal program consist of the coordinates of a set of points, together with a colour for each point. This led to an interesting idea: suppose we imagine a small number of coloured points drifting gently around the plane, and at each instant of time we compute a Newton-Raphson fractal for the current positions of the points. This should lead to a sequence of fractals which flow naturally on from each other, and as the bright central point in each coloured region moves, the regions move with them and the fractal phenomena on the region boundaries swirl continuously.

Here is such an animation. To create this, I've set up three points, each moving along the Lissajous curve $(\sin t, \sin 2t)$, and each one third of the way further around the curve than the last.

The image quality isn't amazing (due to the MPEG compression), but one feature that's just about visible is the lines of additional bright spots moving within each coloured region, which appear to become the blobs on the connecting line when the other two regions come together to squash the line of spots.

Evolution

Looking at the above rainbow-coloured plot with seven roots in a line, I found myself thinking that it's not very pretty, because the fractal blobs are so small. Why are they so small? Could anything be done to make them bigger?

I speculated that perhaps the reason the blobs are so small (implying that Newton-Raphson converges particularly well for this function) might be because polynomials of a high degree are generally very steep, and curve over very sharply at the top of peaks, so there isn't a large region of the plane in which the fractal self-similarity can be observed. This suggested that an obvious way to soften the process might be to reduce the overall degree of the function: to make it more like $x^4$ than $x^7$. How to do that while preserving the locations of the roots? Why, take the square root, of course!

Thinking about this a bit further, that does sound like a good idea. In the real numbers, the square root transformation squashes things towards the x-axis, and it squashes them more the further away they are; so I would indeed expect it to turn a very sharp switchback into a more gentle curve.

Moreover, it's trivially easy to do in the Newton-Raphson iteration. Suppose we have a function $f$, and we define a function $g$ to be $f^k$ for some (constant) power $k$. Then $g' = k f^{k-1} f'$; so when we compute $g/g'$, the $f^{k-1}$ term cancels out, and we are left with simply $(1/k)(f/f')$. In other words, the sole effect of raising the entire function to the power $k$ is to multiply a constant factor of $1/k$ into the distance moved per iteration.
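A small sketch of that observation, using $f(z)=z^2-2$ as an arbitrary example (not from this page): the only change raising $f$ to the power $k$ makes is the factor $1/k$ on the step, so $k=2$ converges in more, smaller steps.

```python
import math

def newton_with_power(k, z=1.5, tol=1e-12, max_iter=1000):
    """Newton-Raphson for f(z) = z^2 - 2 raised to the overall power k:
    the only change is the factor 1/k on the step f/f'."""
    for n in range(1, max_iter + 1):
        step = (1.0 / k) * (z * z - 2) / (2 * z)
        z -= step
        if abs(step) < tol:
            return n, z
    return max_iter, z

it_fast, z_fast = newton_with_power(k=1.0)   # ordinary: quadratic convergence
it_slow, z_slow = newton_with_power(k=2.0)   # halved steps: slower but surer
```

Both runs find $\sqrt2$, but the $k=2$ run needs noticeably more iterations.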

So this had to be worth a try. Here are the results from taking the rainbow plot above, and raising the entire function to a variety of overall powers:

(Overall powers shown: 1.0, 0.75, 0.6 and 0.52.)

As I had hoped, the small fractal blobs have expanded into larger fractal blobs, which is good; and they've become more complex and spiky in shape as well, because the smaller fractal blobs on them have also expanded. What I wasn't expecting, however, was that the dividing lines between the main convergence regions have entirely changed shape, becoming more wiggly.

It turns out that actually taking the square root of the function (raising it to the power 0.5) is not feasible, since the algorithm converges more and more slowly the closer to 0.5 you get. This is because raising the function to the power 0.5 causes us to double the distance moved per iteration; so once we're near to a root, $f/f'$ gives you an accurate estimate of the distance to the root, and we go twice as far, causing us to end up just as far away from the root on the far side, so no wonder the iteration fails to converge. 0.52 was the smallest value I could use without having to increase my iteration limit.
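To see that overshoot concretely, here's a tiny sketch with $f(z)=z^2-2$ (my example, not the page's): with the doubled step, each iteration lands about as far from the root as it started, on the opposite side, and the process never settles.

```python
import math

# k = 0.5 doubles the Newton-Raphson step f/f', so near a root each
# iteration jumps to roughly the mirror-image point on the other side:
z = 1.5
for _ in range(100):
    z -= 2 * (z * z - 2) / (2 * z)

still_far = abs(z - math.sqrt(2)) > 0.05   # still nowhere near the root
```

For this particular quadratic the doubled-step map even cycles exactly with period 2, so after an even number of steps we are back at the starting point.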

This tendency to overshoot the root we're aiming at when using $k<1$ also offers an alternative intuitive explanation for the fractal shapes becoming bigger. The fractal shapes arise because near to a boundary between convergence regions there is a tendency for a single iteration step to travel a long distance into remote regions of the complex plane; so if we're moving even further in a single iteration, then we would indeed expect not to have to be so close to the boundary before experiencing this phenomenon.

Raising the value of $k$ to something greater than 1 has the opposite effect: we now converge toward our root in smaller steps, more slowly but more surely, and so we're less likely to overshoot and the fractal effects on the boundary lines become less pronounced. I'm not going to exhibit any pictures of that, because they're boring. However, I'll come back to this later.

As well as raising the entire polynomial to a power, it's also interesting to see what happens if we try raising only one of the $(z-a)$ factors to a power. To do this we must first think about whether this can be efficiently implemented in software.

It can, it turns out. Observe that if you have a function made up of the product of a lot of smaller factors $f = abcd$, then the product rule gives its derivative as the sum of a list of terms $$f' = a'bcd + ab'cd + abc'd + abcd'$$ in which each term looks very similar to $f$ itself. In fact we can divide both sides by $f$ to give us $$\frac{f'}{f} = \frac{a'}{a} + \frac{b'}{b} + \frac{c'}{c} + \frac{d'}{d}$$ which is precisely the thing whose reciprocal we subtract from $z$ in the N-R formula. Now when each factor $a$ is of the form $(z-k)$, then the resulting term $a'/a$ looks like $1/(z-k)$; and as we observed in the previous section, if the factor is raised to a power so that it's of the form $(z-k)^e$ then the resulting term $a'/a$ is only altered by a constant, so it becomes $e/(z-k)$.

So if we have a function which is the product of a number of terms of the form $(z-a_1)^{b_1}$, then we can efficiently perform its Newton-Raphson iteration without ever computing the entire function itself. We simply compute $b_1/(z-a_1)$ for each term, sum them, take the reciprocal, and subtract that from $z$.
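A sketch of that procedure (function name mine); with all powers set to 1 it must agree with the ordinary Newton-Raphson step for the expanded polynomial, which we can check for $z^4-1$:

```python
def nr_step(z, factors):
    """One Newton-Raphson step for f(z) = prod of (z - a)^b, computed
    from f'/f = sum of b/(z - a) without ever evaluating f itself."""
    s = sum(b / (z - a) for a, b in factors)
    return z - 1 / s

# with all powers 1 this agrees with the ordinary step for z^4 - 1:
factors = [(1, 1), (-1, 1), (1j, 1), (-1j, 1)]
z = 0.4 + 0.7j
direct = z - (z**4 - 1) / (4 * z**3)
```

The powers `b` can be any real (or, as below, complex) numbers.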

A useful point about this procedure is that it means we can conveniently raise each factor of our 'polynomial' (which isn't very polynomial any more, really) to not only an arbitrary real power, but to a complex power if we so wish. And it turns out that we do so wish, because some very impressive fractals turn up if we do. Here's a map of 121 small fractal images, all generated from functions with the same three roots, namely the complex cube roots of 1. But the red root (1 itself) has been raised to a different power in each picture: across the map the real part of the power runs from 0 to 1, and downwards the imaginary part runs from 0 to 1.

As mentioned above, raising a root to a power of 0.5 or less prevents the iteration from converging to that root at all. But in the presence of other roots to which we can still converge, the region of non-convergence – shown in black on the above map – forms complex and interesting fractal shapes. Meanwhile, applying an imaginary part to the power of the red root causes a twisting effect: the higher the power, the less straight and more spiral the shape; and it appears that it's the real part of the power which must be greater than 0.5 to manage to converge.

Some of the images in that map are well worth expanding to a larger size to admire in more detail. Here are four particularly good ones:

(Powers shown: $0.5$, $0.5+0.3i$, $i$ and $0.4+0.9i$.)

Imitation

If you look again at the images in the above section, you may notice that some of the shapes – particularly the ones that occur when things are raised to real powers near 0.5 – are beginning to look interestingly like the shape of the Mandelbrot set. This suggested to me that there might be some sort of close relationship between these fractals and Mandelbrot/Julia sets.

So an obvious question to ask along these lines is, is there a function to which we could apply the Newton-Raphson formula $z \mapsto z - \frac{f(z)}{f'(z)}$ and end up with the Julia set iteration $z \mapsto z^2+c$?

Well, if you write down the equation $z - \frac{f(z)}{f'(z)} = z^2+c$, or equivalently $z - \frac{f}{df/dz} = z^2+c$, then it looks a lot like a differential equation. If only we were working in the positive reals instead of the complex numbers, we could solve it quite easily by separating variables into $df/f = -dz/(z^2-z+c)$, factorising the denominator of the RHS and turning it into partial fractions, and integrating to get $f = (z-a)^{1/(b-a)}(z-b)^{1/(a-b)}$, where $a$ and $b$ are the roots of the quadratic $z^2-z+c$. (Unless $c=\frac14$, which is a special case arising from $z^2-z+c$ being a perfect square and looks rather different; but I'll ignore that, because it's not a very interesting Julia set anyway.)

And it turns out that even in the complex numbers this answer works plausibly. The function $(z-a)^{1/(b-a)}(z-b)^{1/(a-b)}$ is of precisely the form discussed in the previous section: it's the product of linear terms raised to arbitrary powers. Hence we can conveniently do the Newton-Raphson iteration for this function by the method described above: compute $\frac1{(b-a)(z-a)} + \frac1{(a-b)(z-b)}$ and subtract its reciprocal from $z$. If you do that, remembering that by construction $a+b=1$ and $ab=c$, you will indeed find that a lot of algebraic mess cancels out and you end up with the iteration $z \mapsto z^2+c$.
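The cancellation can be checked numerically in a few lines (a sketch; the test point is arbitrary, and the constant $c$ is one of the Julia constants used on this page):

```python
import cmath

c = 0.28 + 0.528j

# a and b are the roots of z^2 - z + c, so a + b = 1 and a*b = c:
d = cmath.sqrt(1 - 4 * c)
a, b = (1 + d) / 2, (1 - d) / 2

def julia_step(z):
    """Newton-Raphson step for f(z) = (z-a)^(1/(b-a)) (z-b)^(1/(a-b)),
    computed via f'/f as in the previous section."""
    s = 1 / ((b - a) * (z - a)) + 1 / ((a - b) * (z - b))
    return z - 1 / s

z = 0.3 - 0.2j   # arbitrary test point
```

At any such test point, `julia_step(z)` agrees with $z^2+c$ to machine precision.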

So there is a family of functions to which the application of the Newton-Raphson formula yields a Julia set iteration, and moreover those functions aren't very far away from the ones we've already been considering here. To prove it works, here are a couple of plots of Julia sets, with their corresponding Newton-Raphson plot:

Julia set for $0.28+0.528i$; Julia set for $-0.656+0.272i$
Newton-Raphson fractal for $(z-(0.9994-0.5286i))^{-0.4722-0.4998i} \times (z-(0.0006+0.5286i))^{0.4722+0.4998i}$
Newton-Raphson fractal for $(z-(1.4623-0.1413i))^{-0.5086-0.0747i} \times (z-(-0.4623+0.1413i))^{0.5086+0.0747i}$

You will have noticed, of course, that the colouring is very different. The two types of fractal are using matching iteration formulae, but a Julia set plotter concentrates on how long it takes the iteration value to land outside a critical circle, whereas the Newton-Raphson plotter is actually waiting for the iterates to converge to a point, and is only incidentally observing what happens when they don't. So you wouldn't actually want to throw away your dedicated Julia-set plotting programs; but it's interesting, nonetheless, that something very like Julia sets are a special case of Newton-Raphson fractals.

Dissection

Another noticeable thing about the above fractals is that some have much more fractal content than others. The original fractal at the top of this page with roots at $-1$, $+1$, $-i$, $+i$ has four big regions of flat colour meeting at the origin, and the self-similar nature of the fractal causes that quadruple contact point to be replicated in other parts of the plane. This gives rise to qualitatively more interesting fractal phenomena than the plot with seven roots in a line, in which all the large convergence regions are separated by boring curves which never meet, and the fractal blobs reflect this structure.

So I wondered, is there a way that we can predict how the convergence regions are going to be shaped, and thereby construct polynomials which have triple or quadruple meeting points exactly where we want them?

Yes, as it turns out, there is.

I observed above that raising a polynomial to a real power greater than 1 has the effect of causing the iteration to take smaller steps towards the root, which in turn causes convergence to be slower but surer and reduces the incidence of overshoot leading to fractal phenomena. As a result of this, the boundary lines between regions become clearer and smoother and simpler. This seemed like a good thing if we wanted to know the overall structure of the fractal; but merely clearer isn't good enough. I wanted clearest.

So I wondered what would happen 'in the limit', as the iteration speed slowed down more and more. In other words, I was interested in what I'm going to describe as continuous Newton-Raphson, in which you start your 'iteration' at an arbitrary point $z_0$ on the plane and then let $z$ evolve continuously according to the differential equation $\frac{dz}{dt} = -\frac{f(z)}{f'(z)}$.

As in the previous section, we can 'solve' this equation by separating variables and naïvely integrating, trying hard to ignore the question of whether this is rigorous or even meaningful in the complex numbers. This time we end up with $\frac1f\frac{df}{dz}dz = -dt$, and then we can apply the integration-by-substitution formula to the LHS to give us $\frac{df}{f} = -dt$, which integrates to give us $f(z) = Ae^{-t}$ for some constant $A$. We can check by differentiating with respect to $t$ that this does indeed turn out to be a plausible answer to the equation even in the complex numbers: the chain rule tells us $\frac{d}{dt} f(z) = f'(z) \frac{dz}{dt}$, so we end up with $f'(z) \frac{dz}{dt} = -Ae^{-t} = -f(z)$ as required. And the constant $A$ is clearly equal to $f(z_0)$, the value of $f$ at the point where we started our integration.
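We can also sanity-check this solution numerically (a sketch using crude Euler integration and an arbitrary example function $f(z)=z^2-2$; none of these specifics are from the page): integrating from $t=0$ to $t=1$, the ratio $f(z(1))/f(z_0)$ should come out close to $e^{-1}$.

```python
import cmath

def f(z):  return z * z - 2     # an arbitrary example function
def fp(z): return 2 * z

z = 3 + 1j                       # starting point z_0
f0 = f(z)

dt, steps = 1e-4, 10000          # Euler steps covering t in [0, 1]
for _ in range(steps):
    z += dt * (-f(z) / fp(z))    # dz/dt = -f(z)/f'(z)

ratio = f(z) / f0                # should be close to e^{-1}, a real number
```

Note that the ratio comes out (nearly) real even though every individual quantity is complex, exactly as the solution predicts.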

'Very nice', I hear you protest, 'but what does that mean?'

Well, since $t$ is a positive real, it means that whatever path $z$ follows from our starting point $z_0$ must have the property that at all times $f(z)$ is equal to $k f(z_0)$, where $k = e^{-t}$ is a real value which continuously and monotonically decreases from 1 toward 0. So we normally expect the limit of such an 'iteration' to be a point at which $f(z)$ is actually equal to zero, i.e. a root of $f$.

But that's not the only thing that can happen. It's also possible to imagine that we might encounter a point along this path at which there isn't a clear direction we should head in: either there is no direction in which we can head from $z$ which will cause $f(z)$ to continue decreasing as a real multiple of its initial value, or perhaps there is more than one such direction.

How do we find such points? Well, for most points in the complex plane the differential equation we started with will usually give us a unique direction in which we can head to linearly decrease $f(z)$: we compute $f(z)/f'(z)$, and head in the direction pointed to by that complex number. This fails if $f(z)=0$, in which case there is no clear direction to head in; we expect that, of course, because if $f(z)=0$ then we've already reached a root. But it can also fail if $f'(z)=0$. So this suggests that the roots of the derivative of $f$ might be worth investigating.

So consider some root $r$ of $f'$, at which $f$ itself is non-zero. If we can find any continuous paths in the complex plane which start at $r$ and have $f$ increasing as a real multiple of $f(r)$, then any point on one of those paths will be a point at which starting a continuous Newton-Raphson process will cause it to head backwards along the same path and terminate at $r$ rather than at a root. In other words, we expect those paths to be precisely the boundaries between the convergence regions for the various roots (since that's the obvious set of points which we expect not to converge sensibly to a root); and moreover, on every such boundary we expect to find a root of $f'$ (because a continuous Newton-Raphson process started on any boundary has to end up somewhere).

That's a lot of maths to endure without a break for a pretty picture, and it's also a long and rather handwavey chain of reasoning to endure without some sort of reassurance that what I'm saying still makes sense. So, here I present a sample polynomial Newton-Raphson fractal, with the roots of the polynomial's derivative marked as black blobs. Observe that each dividing line between convergence regions has a blob somewhere on it, and that in particular the point where three regions (and three such lines) meet has a blob.

So now we know, at least in theory, how to find the lines dividing the different convergence regions. Now, what distinguishes a simple dividing line, separating only two regions, from a point at which three or more regions meet?

Let's consider our root $r$ of $f'$ again. We're now interested in how many directions we can head away from $r$ in, such that $f$ increases as a real multiple of $f(r)$. Equivalently, we're interested in directions we can head in such that $f(z)=(1+k)f(r)$ for some real positive $k$; in other words, we want small values $\epsilon$ such that $f(r+\epsilon)/f(r)-1$ is real and positive.

So now let's consider the Taylor series expansion for the function $f(r+\epsilon)/f(r)-1$. We have $$f(r+\epsilon) = f(r) + \epsilon f'(r) + \epsilon^2 \frac{f''(r)}{2!} + \epsilon^3 \frac{f'''(r)}{3!} + \cdots$$ and we know that $f'(r) = 0$ by construction, so this gives us $$\frac{f(r+\epsilon)}{f(r)}-1 = \epsilon^2 \frac{f''(r)}{2!\,f(r)} + \epsilon^3 \frac{f'''(r)}{3!\,f(r)} + \cdots$$

In the usual case, $f''(r)$ will be non-zero, so for small values of $\epsilon$, $f(r+\epsilon)/f(r)-1$ will be approximately equal to $K\epsilon^2$ for some (complex) constant $K$. This means that we expect to have two directions in which we can head in order to make $f(r+\epsilon)/f(r)-1$ real and positive: one direction corresponding to each square root of $1/K$. This fits with what we expect, because in the usual case a root of $f'$ gives rise to a region boundary extending away from it in two opposite directions.

But if $f''(r)$ is zero as well, then the $\epsilon^3$ term will now be the dominating one in the Taylor expansion; so for small values of $\epsilon$, $f(r+\epsilon)/f(r)-1$ will be approximately equal to $K\epsilon^3$ for some $K$. And now we expect to have three region boundaries coming out of our root $r$, one for each cube root of $1/K$. If the third derivative of $f$ is zero at $r$ too, then we can have four region boundaries, and so on.

And there it is: that's the result I've been working towards for this entire section. A multiple-region meeting point occurs when a root $r$ of $f'$ is also a root of $f''$, and the more higher derivatives are zero at $r$ the more regions meet there. For polynomials in particular, this translates into $r$ being a repeated root of $f'$, and the more repeated the merrier.

To demonstrate beyond reasonable doubt that this actually works, I will now construct from first principles a Newton-Raphson fractal containing two points at each of which five regions meet. For a five-way meeting point we need $f'$ to have a four-times-repeated root; so let's set $f'(z)=(z-1)^4(z+1)^4$, which has two such repeated roots as desired. We 'integrate' this in the naïve way (actually what we're doing is finding an anti-derivative of it, integration in the complex plane being a generally messy concept) to obtain the nasty-looking polynomial $(35z^9 - 180z^7 + 378z^5 - 420z^3 + 315z)/315$. We can discard the constant factor of $1/315$ since it makes no difference to the convergence, and this gives us $35z^9 - 180z^7 + 378z^5 - 420z^3 + 315z$. We find the roots of this (using Newton-Raphson for its more conventional purpose) and feed them into our fractal plotter, and get this picture:

And, exactly as we asked for, this has two five-way meeting points at locations $+1$ and $-1$. Bingo!
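The integration step above can be checked with a few lines of coefficient arithmetic (a sketch; the helper name is mine):

```python
from fractions import Fraction

def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists, constant term first."""
    out = [0] * (len(p) + len(q) - 1)
    for i, pa in enumerate(p):
        for j, qb in enumerate(q):
            out[i + j] += pa * qb
    return out

# f'(z) = (z-1)^4 (z+1)^4 = (z^2 - 1)^4:
fprime = [1]
for _ in range(4):
    fprime = poly_mul(fprime, [-1, 0, 1])   # multiply by (z^2 - 1)

# anti-derivative term by term, taking the constant of integration as 0:
f = [Fraction(0)] + [Fraction(c, i + 1) for i, c in enumerate(fprime)]

# clearing the denominator of 315 should give the polynomial quoted above:
scaled = [int(315 * coeff) for coeff in f]
```

The `scaled` list comes out as the coefficients of $315z - 420z^3 + 378z^5 - 180z^7 + 35z^9$, matching the text.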

Of course, integration lets us add an arbitrary constant term without affecting the derivative of our result. Here's what we get if we add a constant term of $50+200i$ to the above polynomial:

The roots of the polynomial have moved around, and the picture is distorted, but the region meeting points are still exactly where we asked for them.

Oscillation

Here's another curiosity. The following fractal is plotted using a pure polynomial $f$, placing the roots at $+i$, $-i$, $-2.3$ and $+2.3$:

The interesting feature of this picture is the black areas, which are regions in which Newton-Raphson failed to converge to a root within the program's limit of 256 iterations. At first sight one might assume that this is an artifact of having a finite iteration limit at all, but one would (it turns out) be wrong. Even if you crank up the iteration limit to a much larger number, those black areas stay black, because they represent genuine non-convergence.

What's actually happening here is that for this particular polynomial the Newton-Raphson method gives rise to period-2 cyclic behaviour. There's a pair of points $a$ and $b$, one in the middle of each of the two main black areas, which have the property that a single iteration of Newton-Raphson starting from $a$ takes you to $b$, and vice versa. But this is more than just a cycle between those two points: it's an attracting cycle. If you start from a point somewhere near $a$, an iteration of Newton-Raphson will take you to somewhere even nearer to $b$, from which another iteration will land you nearer still to $a$ again. Hence, there's a sizeable region around each point of the cycle which all converges to the cycle itself, and hence never settles down to a root of the polynomial.

This is interesting to me because it happens so rarely; I've plotted quite a lot of these fractals, including a lot of animated ones in which the points wander continuously around the plane, and this particular arrangement of roots is the only case I've found in which a polynomial gives rise to a non-converging area. (It's not the only actual polynomial: you can obviously rotate or translate the root positions to obtain other polynomials with the same behaviour, and you can also move the roots around by a small amount relative to each other before the black areas go away. But this general pattern – four roots in a diamond layout of roughly this shape – is the only pattern I know of that does this.)

On one level, it's reasonably easy to see 'why' it happens. If you define $g(x) = x - f(x)/f'(x)$ to be the Newton-Raphson iteration for this polynomial, then find roots of the equation $g(g(x))=x$ (probably using a computer algebra package, since it's a pretty ugly mess) you will observe that it has roots which are not also roots of the similar first-order equation $g(x)=x$ (and hence of $f$). That proves that there are cycles at all; then, to prove there are attracting cycles, compute the derivative of $G(x)=g(g(x))$ at those roots and find the ones where it has modulus less than 1 (since for small $h$, $G(r+h)$ will be approximately $G(r)+G'(r)h$, and if $G(r) = r$ and $|G'(r)| < 1$ then this will be closer to $r$ than we started out).
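That verification can also be done numerically rather than with computer algebra (a sketch; the starting point 0.8 is my guess at a point inside one of the black areas, and is not taken from the page):

```python
# Newton-Raphson step for the polynomial with roots +i, -i, -2.3, +2.3,
# i.e. f(z) = (z^2 + 1)(z^2 - 2.3^2):
def g(z):
    fz  = (z * z + 1) * (z * z - 5.29)
    fpz = 4 * z**3 - 8.58 * z
    return z - fz / fpz

# iterate from a guessed point inside one of the black areas; if the
# attracting 2-cycle exists, the iterates settle onto it:
z = 0.8
for _ in range(200):
    z = g(z)
a, b = z, g(z)                        # the two points of the cycle

period2  = abs(g(b) - a) < 1e-9       # g(g(a)) comes back to a
not_root = min(abs(a - r) for r in (1j, -1j, 2.3, -2.3)) > 0.5

# numerical derivative of G = g(g(x)) at a: G'(a) = g'(b) * g'(a)
h = 1e-6
gp = lambda w: (g(w + h) - g(w - h)) / (2 * h)
attracting = abs(gp(a) * gp(b)) < 1
```

All three checks come out true: the limit is a genuine 2-cycle, it is well away from every root of $f$, and the cycle is attracting.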

But that doesn't really explain the phenomenon, or predict what other sorts of polynomial will give rise to cyclic behaviour of this type. Do there exist polynomials which exhibit period-$n$ cycles for all $n > 1$, for example? How often does this happen? How would one go about constructing polynomials with non-converging regions to order?

I don't currently know the answers to these questions, but I'd be interested to hear them if anyone else does.

Participation

If you want to generate some of these fractal images yourself, you can download a program to generate them here: C source code and precompiled Windows executable. The program will produce PNG, Windows .BMP, or PPM files, and image processing software should be able to convert those to other formats if you prefer.

Just typing 'newton' should give some help about what all the command-line options do. If you want a quick start, here's a selection of sample command lines you might like to try:

  • newton -o simple.png -s 256x256 -x 2 -c 1,0,0:1,1,0:0,0.7,0:0,0.5,1 -- -1 +1 -i +i
    This one is the cross-shaped plot I used at the top of this page, based on the polynomial $z^4-1$. To zoom in on one of the boundary lines as I did above, replace '-x 2' with '-x 0.4 -X 1.05 -Y 1.05'.
  • newton -o rainbow.png -s 320x256 -x 5 -c 1,0,0:1,0.7,0:1,1,0:0,1,0:0,0.7,1:0,0,1:0.5,0,1 -- -3 -2 -1 0 1 2 3
    This one is the rainbow-coloured plot with seven roots strung out in a long line.
  • newton -o powermap.png -s 256x256 -x 2 -c 1,0,0:1,1,0:0,1,0 -- 1/power -0.5-0.8660254i -0.5+0.8660254i
    This command plots the various images from the large map shown above, with the red root raised to an arbitrary power. Replace power with the power you want; for example, replace it with 0.5 to get the Mandelbrot-like picture, and with 0.4+0.9i to get the disconnected yellow-and-green plot.
  • newton -o holes.png -s 320x256 -y 2 -c 1,0,0:1,1,0:0,0.7,0:0,0.5,1 -- -2.3 +2.3 -i +i
    This one is the plot with black non-converging areas.
  • By default the program will use the cyclic shading behaviour with 16 shades of each colour. You can specify -C no to turn off cyclic shading (so the colours get uniformly darker as more iterations are needed), -B yes to turn on blurring of the iteration boundaries, and -f 32 or -f 64 if you want to increase the number of shades used.
  • To raise the function to an overall power, use the -p option, for example -p 0.52. This will slow down convergence, so you will probably also need to increase the number of shades of colour (the -f option, as discussed above) in order to make the result look nice.
  • To raise an individual root to a power other than 1, put a slash after that root followed by the power value. For example, an argument of 2+i/1-0.5i specifies a factor of $(x-(2+i))^{1-0.5i}$.
  • If you want a larger version of an image, just change the picture size specified in the -s option.
  • The program has a number of other options; just type newton on its own to list them.

Here are some pre-generated larger versions of the above images:

Recognition

Newton-Raphson fractals are not a new idea of mine, although most other pages I've seen don't go so far into the maths. Here are a few such pages; you can probably find more by googling for 'Newton-Raphson fractal'.

  • Alan Donovan was the person who introduced me to these fractals in the first place.
  • Tom Womack also thought of animating them, although his page unfortunately doesn't include any actual movie files.
  • Paul Bourke has some particularly well-chosen and pretty images.
  • Bryan Krofchok has a more varied – and more colourful – gallery.
  • MathWorld's page on the Newton-Raphson method itself mentions its fractal property, and has some small examples and further references.
(comments to anakin@pobox.com)
(thanks to chiark for hosting this page)

(last modified on Fri Feb 17 14:02:09 2017)